Apr 21 14:22:58.149754 ip-10-0-138-93 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 14:22:58.149765 ip-10-0-138-93 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 14:22:58.149773 ip-10-0-138-93 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 14:22:58.150004 ip-10-0-138-93 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 14:23:09.595260 ip-10-0-138-93 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 14:23:09.595272 ip-10-0-138-93 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot fa51ea59bfc34c9f9418b467312bffd9 --
Apr 21 14:25:20.341082 ip-10-0-138-93 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 14:25:20.748587 ip-10-0-138-93 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:25:20.748587 ip-10-0-138-93 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 14:25:20.748587 ip-10-0-138-93 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:25:20.748587 ip-10-0-138-93 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 14:25:20.748587 ip-10-0-138-93 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:25:20.750856 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.750765    2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 14:25:20.753699 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753682    2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:25:20.753699 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753699    2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753702    2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753706    2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753709    2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753712    2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753716    2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753719    2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753733    2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753737    2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753740    2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753749    2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753753    2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753755    2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753759    2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753761    2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753764    2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753767    2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753769    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753772    2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753774    2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:25:20.753782 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753777    2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753780    2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753782    2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753785    2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753788    2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753792    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753795    2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753797    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753800    2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753802    2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753805    2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753807    2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753810    2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753812    2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753815    2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753818    2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753820    2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753823    2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753825    2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753828    2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:25:20.754271 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753831    2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753834    2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753837    2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753839    2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753842    2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753844    2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753847    2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753850    2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753852    2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753855    2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753858    2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753860    2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753863    2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753865    2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753868    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753871    2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753874    2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753876    2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753879    2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753882    2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:25:20.754814 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753886    2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753888    2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753892    2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753896    2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753899    2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753902    2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753905    2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753907    2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753910    2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753912    2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753915    2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753920    2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753924    2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753927    2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753930    2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753933    2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753936    2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753939    2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:25:20.755296 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753942    2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753945    2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753947    2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753950    2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753953    2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753955    2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.753958    2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754351    2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754356    2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754359    2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754362    2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754365    2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754368    2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754371    2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754374    2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754376    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754379    2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754382    2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754384    2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754387    2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:25:20.755752 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754390    2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754393    2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754396    2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754398    2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754401    2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754403    2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754406    2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754408    2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754411    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754414    2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754416    2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754419    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754422    2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754425    2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754427    2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754430    2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754433    2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754435    2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754438    2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754440    2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:25:20.756260 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754444    2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754446    2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754449    2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754452    2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754454    2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754457    2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754460    2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754462    2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754464    2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754467    2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754470    2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754472    2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754475    2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754477    2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754480    2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754482    2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754484    2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754487    2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754489    2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754493    2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:25:20.756771 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754495    2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754498    2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754500    2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754503    2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754506    2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754508    2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754511    2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754513    2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754516    2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754519    2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754522    2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754525    2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754528    2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754532    2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754536    2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754539    2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754542    2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754545    2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754547    2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754550    2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:25:20.757276 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754552    2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754555    2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754557    2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754562    2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754566    2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754568    2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754571    2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754573    2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754576    2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754578    2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754581    2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754583    2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.754586    2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755157    2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755168    2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755174    2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755180    2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755185    2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755189    2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755193    2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 14:25:20.757784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755198    2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755201    2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755205    2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755208    2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755212    2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755215    2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755219    2583 flags.go:64] FLAG: --cgroup-root=""
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755222    2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755225    2583 flags.go:64] FLAG: --client-ca-file=""
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755228    2583 flags.go:64] FLAG: --cloud-config=""
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755231    2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755234    2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755239    2583 flags.go:64] FLAG: --cluster-domain=""
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755242    2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755246    2583 flags.go:64] FLAG: --config-dir=""
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755248    2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755252    2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755256    2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755259    2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755262    2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755266    2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755269    2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755272    2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755276    2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755280    2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 14:25:20.758276 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755283    2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755288    2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755291    2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755294    2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755298    2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755301    2583 flags.go:64] FLAG: --enable-server="true"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755304    2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755309    2583 flags.go:64] FLAG: --event-burst="100"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755312    2583 flags.go:64] FLAG: --event-qps="50"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755315    2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755319    2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755322    2583 flags.go:64] FLAG: --eviction-hard=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755326    2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755329    2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755333    2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755336    2583 flags.go:64] FLAG: --eviction-soft=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755339    2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755342    2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755345    2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755348    2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755351    2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755354    2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755357    2583 flags.go:64] FLAG: --feature-gates=""
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755360    2583 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755363    2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 14:25:20.758914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755367    2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755370    2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755373    2583 flags.go:64] FLAG: --healthz-port="10248"
Apr 21
14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755376 2583 flags.go:64] FLAG: --help="false" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755379 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-138-93.ec2.internal" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755383 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755387 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755390 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755394 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755397 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755400 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755404 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755407 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755410 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755413 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755416 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755419 2583 
flags.go:64] FLAG: --kube-reserved="" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755422 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755425 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755428 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755431 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755434 2583 flags.go:64] FLAG: --lock-file="" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755436 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755439 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 14:25:20.759536 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755442 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755448 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755452 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755455 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755458 2583 flags.go:64] FLAG: --logging-format="text" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755461 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755464 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 14:25:20.760136 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:25:20.755467 2583 flags.go:64] FLAG: --manifest-url="" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755470 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755475 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755478 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755482 2583 flags.go:64] FLAG: --max-pods="110" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755485 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755494 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755497 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755500 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755503 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755507 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755510 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755518 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755522 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755525 2583 flags.go:64] 
FLAG: --oom-score-adj="-999" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755528 2583 flags.go:64] FLAG: --pod-cidr="" Apr 21 14:25:20.760136 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755531 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755538 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755541 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755544 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755547 2583 flags.go:64] FLAG: --port="10250" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755550 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755553 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-075f273a8dc6ee884" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755557 2583 flags.go:64] FLAG: --qos-reserved="" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755560 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755563 2583 flags.go:64] FLAG: --register-node="true" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755566 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755569 2583 flags.go:64] FLAG: --register-with-taints="" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755573 2583 flags.go:64] FLAG: --registry-burst="10" Apr 21 14:25:20.760692 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755576 2583 flags.go:64] FLAG: --registry-qps="5" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755579 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755582 2583 flags.go:64] FLAG: --reserved-memory="" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755586 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755589 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755592 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755595 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755598 2583 flags.go:64] FLAG: --runonce="false" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755601 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755604 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755609 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755612 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755615 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 14:25:20.760692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755618 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755625 2583 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755629 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755632 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755635 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755638 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755641 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755644 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755647 2583 flags.go:64] FLAG: --system-cgroups="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755650 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755656 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755659 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755662 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755667 2583 flags.go:64] FLAG: --tls-min-version="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755670 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755673 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: 
I0421 14:25:20.755676 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755679 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755682 2583 flags.go:64] FLAG: --v="2" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755686 2583 flags.go:64] FLAG: --version="false" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755690 2583 flags.go:64] FLAG: --vmodule="" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755695 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.755698 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755806 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:25:20.761372 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755810 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755815 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755819 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755822 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755825 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755830 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755833 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755836 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755839 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755843 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755847 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755850 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755852 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755855 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755858 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755860 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755863 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755866 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755869 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 14:25:20.762040 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755872 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755875 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755878 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755881 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755884 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755887 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755890 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755892 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755895 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755897 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755900 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755902 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755905 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755907 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755910 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755912 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755915 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755917 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755922 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755925 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 14:25:20.762556 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755927 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755930 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755934 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755937 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755940 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755943 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755946 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755949 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755951 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755954 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755957 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755959 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755962 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755964 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755967 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755970 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755972 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755975 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755977 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755980 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 14:25:20.763130 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755982 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755985 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755987 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755990 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755992 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755995 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.755998 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756000 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756003 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756005 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756009 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756012 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756014 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756017 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756021 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756024 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756027 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756030 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756033 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756035 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 14:25:20.763997 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756038 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756040 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756043 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756045 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756048 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.756051 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.756648 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.762874 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.762891 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764065 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764083 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764089 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764094 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764098 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764102 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 14:25:20.764696 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764107 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764112 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764117 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764122 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764133 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764137 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764142 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764146 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764151 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764155 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764160 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764165 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764170 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764174 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764178 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764183 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764187 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764196 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764200 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 14:25:20.765119 ip-10-0-138-93
kubenswrapper[2583]: W0421 14:25:20.764204 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764209 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764213 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764217 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764222 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764226 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764231 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764236 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764240 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764245 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764255 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764259 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764264 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 
14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764268 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764273 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764278 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764282 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764287 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764292 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764299 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764308 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:25:20.765620 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764313 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764320 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764333 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764338 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764344 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764350 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764355 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764360 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764366 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764370 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764375 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764380 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764384 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764393 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764397 2583 feature_gate.go:328] unrecognized feature 
gate: DyanmicServiceEndpointIBMCloud Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764402 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764406 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764411 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764416 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764420 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:25:20.766131 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764424 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764429 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764433 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764437 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764441 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764446 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764455 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764459 2583 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764464 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764468 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764472 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764477 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764481 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764485 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764490 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764494 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764498 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764503 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764508 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764517 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 
14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.764526 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765283 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765299 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765304 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765309 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765312 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765315 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765319 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765321 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 
14:25:20.765324 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765327 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765329 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765332 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765335 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:25:20.767157 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765337 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765340 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765343 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765345 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765348 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765350 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765353 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765356 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:25:20.767541 
ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765358 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765361 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765364 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765366 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765369 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765372 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765374 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765377 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765380 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765383 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765385 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765389 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:25:20.767541 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765392 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 
14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765395 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765398 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765401 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765403 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765406 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765408 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765411 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765413 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765416 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765418 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765422 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
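The same `unrecognized feature gate` warnings repeat because the gate set is parsed more than once during startup. Collapsing the flood to unique names makes it reviewable; a sketch over a few stand-in lines (a truncated sample, not the full excerpt):

```python
import re

# Stand-in journal lines; the same gate name is warned about once per
# parse pass, so duplicates are expected.
log_lines = [
    "Apr 21 14:25:20.765119 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764065 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation",
    "Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765523 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation",
    "Apr 21 14:25:20.766670 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.764490 2583 feature_gate.go:328] unrecognized feature gate: NewOLM",
]

pattern = re.compile(r"unrecognized feature gate: (\w+)")
# A set comprehension drops the repeats; sorting gives a stable listing.
unique_gates = sorted({m.group(1) for line in log_lines
                       if (m := pattern.search(line)) is not None})
print(unique_gates)  # each rejected gate listed once
```

The same filter applied to the whole journal would show that the repeated blocks above contribute no new gate names, only repetitions.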
Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765427 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765430 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765432 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765435 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765438 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765441 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765443 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765447 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:25:20.768061 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765449 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765452 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765455 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765457 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765460 2583 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765462 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765465 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765467 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765470 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765473 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765476 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765479 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765482 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765485 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765488 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765491 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765493 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765496 2583 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765499 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765502 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:25:20.768540 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765504 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765507 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765510 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765512 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765515 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765518 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765520 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765523 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765525 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765528 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 
14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765531 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765533 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:20.765536 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.765542 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:25:20.769044 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.766867 2583 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 14:25:20.769553 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.769538 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 14:25:20.770294 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.770281 2583 server.go:1019] "Starting client certificate rotation" Apr 21 14:25:20.770399 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.770383 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:25:20.770433 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.770425 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:25:20.790658 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.790631 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:25:20.792940 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.792919 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:25:20.806557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.806532 2583 log.go:25] "Validated CRI v1 runtime API" Apr 21 14:25:20.811685 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.811666 2583 log.go:25] "Validated CRI v1 image API" Apr 21 14:25:20.814184 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.814164 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 14:25:20.819970 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.819944 2583 fs.go:135] Filesystem UUIDs: map[0694e122-7c25-4ba3-923e-1758a0b2aff6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 92710cab-ac7d-4183-ba2a-c12ed51314cb:/dev/nvme0n1p3] Apr 21 14:25:20.820030 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.819970 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 14:25:20.825749 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.825578 2583 manager.go:217] Machine: {Timestamp:2026-04-21 14:25:20.823912324 +0000 UTC m=+0.373138736 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099692 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 
AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f3362a887114e05f6dc38021dc44c SystemUUID:ec2f3362-a887-114e-05f6-dc38021dc44c BootID:fa51ea59-bfc3-4c9f-9418-b467312bffd9 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0a:39:58:b1:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0a:39:58:b1:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:19:de:4b:9a:b4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 14:25:20.825749 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.825718 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 14:25:20.825884 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.825838 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 14:25:20.825884 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.825862 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 14:25:20.826830 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.826805 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 14:25:20.826996 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.826831 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-93.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 14:25:20.827046 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.827005 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 14:25:20.827046 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.827015 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 14:25:20.827046 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.827028 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 14:25:20.827714 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.827704 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 14:25:20.828781 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.828770 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 14:25:20.829072 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.829055 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 14:25:20.832518 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.832508 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 14:25:20.832559 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.832524 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 14:25:20.832559 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.832539 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 14:25:20.832559 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.832550 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 21 14:25:20.832559 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.832558 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 14:25:20.833741 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.833710 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 14:25:20.833799 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.833746 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 14:25:20.838019 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.837996 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 14:25:20.839167 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.839154 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 14:25:20.840520 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840509 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 14:25:20.840557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840526 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 14:25:20.840557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840533 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 14:25:20.840557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840541 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 14:25:20.840557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840550 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 14:25:20.840557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840558 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 14:25:20.840693 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840564 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 14:25:20.840693 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840569 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 14:25:20.840693 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840576 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 14:25:20.840693 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840582 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 14:25:20.840693 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840591 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 14:25:20.840693 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.840600 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 14:25:20.841375 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.841363 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 14:25:20.841424 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.841378 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 14:25:20.844917 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.844904 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 14:25:20.844975 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.844939 2583 server.go:1295] "Started kubelet"
Apr 21 14:25:20.845615 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.845588 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 14:25:20.845717 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.845575 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 14:25:20.845717 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.845655 2583 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 14:25:20.845752 ip-10-0-138-93 systemd[1]: Started Kubernetes Kubelet.
Apr 21 14:25:20.848286 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.848124 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-93.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 14:25:20.848286 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.848260 2583 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 14:25:20.848442 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.848323 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 14:25:20.848442 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.848348 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 14:25:20.849052 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.849027 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 14:25:20.852047 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.852031 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 14:25:20.852641 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.852624 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 14:25:20.853449 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.853415 2583 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 14:25:20.853449 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.853436 2583 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 14:25:20.853606 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.853594 2583 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 14:25:20.853606 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.853605 2583 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 14:25:20.853896 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.853877 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 14:25:20.854167 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.853296 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-93.ec2.internal.18a865612170857c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-93.ec2.internal,UID:ip-10-0-138-93.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-93.ec2.internal,},FirstTimestamp:2026-04-21 14:25:20.844916092 +0000 UTC m=+0.394142504,LastTimestamp:2026-04-21 14:25:20.844916092 +0000 UTC m=+0.394142504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-93.ec2.internal,}"
Apr 21 14:25:20.854758 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.854714 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:20.855806 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.855784 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 14:25:20.855884 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.855851 2583 factory.go:55] Registering systemd factory
Apr 21 14:25:20.855884 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.855872 2583 factory.go:223] Registration of the systemd container factory successfully
Apr 21 14:25:20.856178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.856159 2583 factory.go:153] Registering CRI-O factory
Apr 21 14:25:20.856178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.856177 2583 factory.go:223] Registration of the crio container factory successfully
Apr 21 14:25:20.856315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.856265 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 14:25:20.856315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.856292 2583 factory.go:103] Registering Raw factory
Apr 21 14:25:20.856315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.856304 2583 manager.go:1196] Started watching for new ooms in manager
Apr 21 14:25:20.857231 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.857199 2583 manager.go:319] Starting recovery of all containers
Apr 21 14:25:20.858175 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.858155 2583 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 14:25:20.858536 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.858517 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 14:25:20.861566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.861533 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 14:25:20.868054 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.867867 2583 manager.go:324] Recovery completed
Apr 21 14:25:20.872510 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.872494 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:25:20.872590 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.872565 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2jpmr"
Apr 21 14:25:20.874875 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.874859 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:25:20.874951 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.874893 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:25:20.874951 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.874908 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:25:20.875453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.875441 2583 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 14:25:20.875453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.875453 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 14:25:20.875531 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.875469 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 14:25:20.877176 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.877113 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-93.ec2.internal.18a865612339af74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-93.ec2.internal,UID:ip-10-0-138-93.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-93.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-93.ec2.internal,},FirstTimestamp:2026-04-21 14:25:20.874876788 +0000 UTC m=+0.424103204,LastTimestamp:2026-04-21 14:25:20.874876788 +0000 UTC m=+0.424103204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-93.ec2.internal,}"
Apr 21 14:25:20.878540 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.878526 2583 policy_none.go:49] "None policy: Start"
Apr 21 14:25:20.878592 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.878543 2583 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 14:25:20.878592 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.878554 2583 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 14:25:20.880286 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.880271 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2jpmr"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919242 2583 manager.go:341] "Starting Device Plugin manager"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.919279 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919290 2583 server.go:85] "Starting device plugin registration server"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919536 2583 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919548 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919638 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919710 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.919718 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.920382 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.920428 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.923142 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.923167 2583 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.923187 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.923194 2583 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:20.923227 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 14:25:20.938032 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:20.926589 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:25:21.020315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.020239 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:25:21.021256 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.021234 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:25:21.021373 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.021269 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:25:21.021373 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.021279 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:25:21.021373 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.021305 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.023495 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.023474 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"]
Apr 21 14:25:21.023582 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.023544 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:25:21.024323 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.024303 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:25:21.024413 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.024332 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:25:21.024413 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.024345 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:25:21.026511 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.026497 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:25:21.027166 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027150 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.027250 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027185 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:25:21.027250 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027187 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:25:21.027250 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027212 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:25:21.027250 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027226 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:25:21.027801 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027785 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:25:21.027864 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027812 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:25:21.027864 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.027822 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:25:21.029280 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.029264 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.029351 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.029289 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:25:21.029862 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.029838 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.029862 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.029861 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-93.ec2.internal\": node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.030014 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.029902 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:25:21.030014 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.029927 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:25:21.030014 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.029940 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:25:21.049713 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.049684 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.068789 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.068760 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-93.ec2.internal\" not found" node="ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.073263 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.073242 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-93.ec2.internal\" not found" node="ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.150463 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.150428 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.155850 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.155830 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/209099c90a25d9687e552a56330f77cb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal\" (UID: \"209099c90a25d9687e552a56330f77cb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.155918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.155860 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/209099c90a25d9687e552a56330f77cb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal\" (UID: \"209099c90a25d9687e552a56330f77cb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.155918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.155883 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1f5719e4ae571eebc033266ca01f65bf-config\") pod \"kube-apiserver-proxy-ip-10-0-138-93.ec2.internal\" (UID: \"1f5719e4ae571eebc033266ca01f65bf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.251196 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.251170 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.256609 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.256589 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/209099c90a25d9687e552a56330f77cb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal\" (UID: \"209099c90a25d9687e552a56330f77cb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.256666 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.256622 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/209099c90a25d9687e552a56330f77cb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal\" (UID: \"209099c90a25d9687e552a56330f77cb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.256666 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.256645 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1f5719e4ae571eebc033266ca01f65bf-config\") pod \"kube-apiserver-proxy-ip-10-0-138-93.ec2.internal\" (UID: \"1f5719e4ae571eebc033266ca01f65bf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.256743 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.256687 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/209099c90a25d9687e552a56330f77cb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal\" (UID: \"209099c90a25d9687e552a56330f77cb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.256743 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.256689 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/209099c90a25d9687e552a56330f77cb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal\" (UID: \"209099c90a25d9687e552a56330f77cb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.256807 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.256688 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1f5719e4ae571eebc033266ca01f65bf-config\") pod \"kube-apiserver-proxy-ip-10-0-138-93.ec2.internal\" (UID: \"1f5719e4ae571eebc033266ca01f65bf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.352050 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.351982 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.370540 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.370514 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.375023 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.375003 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.453126 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.453079 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.553547 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.553514 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.654037 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.653953 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-93.ec2.internal\" not found"
Apr 21 14:25:21.721491 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.721463 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:25:21.753830 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.753801 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.770275 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.770243 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 14:25:21.770421 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.770389 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 14:25:21.770460 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.770418 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 14:25:21.770460 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.770433 2583 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a0be2efcfd0564ac38a39b8f64d474bc-7caa1310f31c55cf.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.138.93:40364->54.158.171.253:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.770460 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.770458 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:21.793229 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.793205 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 14:25:21.833738 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.833693 2583 apiserver.go:52] "Watching apiserver"
Apr 21 14:25:21.839983 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.839954 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 14:25:21.842105 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.842083 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-dgzzz","openshift-dns/node-resolver-fr8hk","openshift-multus/multus-fg4r9","openshift-network-operator/iptables-alerter-m5mcs","openshift-ovn-kubernetes/ovnkube-node-pcn9k","kube-system/konnectivity-agent-jqh94","kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj","openshift-image-registry/node-ca-2gtk9","openshift-multus/multus-additional-cni-plugins-88f5f","openshift-multus/network-metrics-daemon-bcph6","openshift-network-diagnostics/network-check-target-pghpm"]
Apr 21 14:25:21.846387 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.846372 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dgzzz"
Apr 21 14:25:21.848796 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.848776 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fr8hk"
Apr 21 14:25:21.848917 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.848824 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fq8sr\""
Apr 21 14:25:21.849156 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.849121 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 14:25:21.849248 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.849168 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 14:25:21.852330 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.850894 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gcqgm\""
Apr 21 14:25:21.852330 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.851138 2583 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.852330 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.851218 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 14:25:21.852330 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.851234 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 14:25:21.852592 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.852378 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 14:25:21.853542 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.853517 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.853678 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.853654 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.855057 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.855040 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 14:25:21.855175 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.855118 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 14:25:21.855354 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.855339 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bg7fp\"" Apr 21 14:25:21.855432 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.855339 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 14:25:21.855432 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.855370 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 14:25:21.856156 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.856136 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vn6nz\"" Apr 21 14:25:21.856458 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.856167 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.856458 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.856179 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:25:21.856458 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.856282 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 14:25:21.857410 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857388 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 14:25:21.857410 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857403 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 14:25:21.857611 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857430 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 14:25:21.857611 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857449 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 14:25:21.857707 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857615 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 14:25:21.857707 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857691 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 14:25:21.857707 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857700 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-66n5s\"" Apr 21 14:25:21.857869 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.857847 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 14:25:21.858519 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.858504 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.859902 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.859887 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 14:25:21.860055 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860025 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 14:25:21.860160 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860105 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-trsck\"" Apr 21 14:25:21.860228 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860212 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-ovn\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860280 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860280 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860273 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a974b412-8de8-4e89-965b-a82f6e82ccf8-konnectivity-ca\") pod \"konnectivity-agent-jqh94\" (UID: \"a974b412-8de8-4e89-965b-a82f6e82ccf8\") " pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.860381 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860304 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvtf\" (UniqueName: \"kubernetes.io/projected/bdbe70b1-4e89-498e-9026-780d05ec3886-kube-api-access-vgvtf\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.860381 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860331 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860381 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860346 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovn-node-metrics-cert\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860381 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860361 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-systemd\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.860544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860394 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwlwk\" (UniqueName: \"kubernetes.io/projected/ff78599b-b18f-47bf-83ba-ffa70116ffdd-kube-api-access-lwlwk\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.860544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860418 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d838269-db57-4db4-b003-dabe835d0054-tmp\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.860544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860435 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2pzh\" (UniqueName: \"kubernetes.io/projected/0e814615-0b94-4de9-9bfd-1b36c817910a-kube-api-access-b2pzh\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.860544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860451 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-multus-certs\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.860544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860505 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-systemd-units\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860541 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovnkube-config\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860568 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e814615-0b94-4de9-9bfd-1b36c817910a-hosts-file\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860592 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-slash\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860614 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-etc-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.860808 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860638 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e814615-0b94-4de9-9bfd-1b36c817910a-tmp-dir\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860669 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2gtk9" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860666 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-socket-dir-parent\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860773 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-netns\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.860808 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860802 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgfff\" (UniqueName: \"kubernetes.io/projected/6d838269-db57-4db4-b003-dabe835d0054-kube-api-access-sgfff\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860828 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-system-cni-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860853 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-cni-multus\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860891 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-log-socket\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860924 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysctl-conf\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860948 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-etc-kubernetes\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860972 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-cni-bin\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.860990 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-cni-netd\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861022 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-var-lib-kubelet\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861044 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff78599b-b18f-47bf-83ba-ffa70116ffdd-cni-binary-copy\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861070 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-env-overrides\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 
21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861093 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-k8s-cni-cncf-io\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861112 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-hostroot\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861129 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-node-log\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861153 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861145 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-run-netns\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861186 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-var-lib-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: 
\"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861219 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysconfig\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bdbe70b1-4e89-498e-9026-780d05ec3886-host-slash\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861269 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-kubelet\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861293 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a974b412-8de8-4e89-965b-a82f6e82ccf8-agent-certs\") pod \"konnectivity-agent-jqh94\" (UID: \"a974b412-8de8-4e89-965b-a82f6e82ccf8\") " pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861318 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-run\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861345 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d838269-db57-4db4-b003-dabe835d0054-etc-tuned\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861368 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-daemon-config\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861390 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-systemd\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861415 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-cni-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861438 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-run-ovn-kubernetes\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861462 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-lib-modules\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861483 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-host\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861508 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-kubelet\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861533 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysctl-d\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 
14:25:21.861570 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshcg\" (UniqueName: \"kubernetes.io/projected/e3bb98dc-f964-4937-9c95-4899ca412b4a-kube-api-access-qshcg\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.861822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861593 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-modprobe-d\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861615 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-cni-bin\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861638 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bdbe70b1-4e89-498e-9026-780d05ec3886-iptables-alerter-script\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861664 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovnkube-script-lib\") pod \"ovnkube-node-pcn9k\" (UID: 
\"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861711 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-kubernetes\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861834 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-sys\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861884 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-cnibin\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861917 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-os-release\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.862886 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.861950 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-conf-dir\") pod 
\"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.863244 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.862951 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.864685 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.864670 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 14:25:21.865112 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.865100 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:21.865190 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.865159 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:21.866594 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866570 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-t6cq2\"" Apr 21 14:25:21.866594 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866589 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 14:25:21.866750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866607 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 14:25:21.866750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866613 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 14:25:21.866750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866649 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 14:25:21.866750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866577 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-z72t5\"" Apr 21 14:25:21.866750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866618 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 14:25:21.866978 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866962 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 14:25:21.867037 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.866983 2583 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 14:25:21.867085 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.867066 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z4mpj\"" Apr 21 14:25:21.867214 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.867200 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:21.867268 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:21.867253 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:21.878374 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.878353 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:21.878905 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.878889 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 14:25:21.882084 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.882047 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 14:20:20 +0000 UTC" deadline="2027-10-08 03:08:58.787720925 +0000 UTC" Apr 21 14:25:21.882084 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.882081 2583 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12828h43m36.905642519s" Apr 21 14:25:21.895834 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.895813 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8thvk" Apr 21 14:25:21.912368 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.912341 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8thvk" Apr 21 14:25:21.955427 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.955396 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 14:25:21.962631 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962605 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-systemd\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.962631 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962634 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-cni-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962652 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-run-ovn-kubernetes\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962674 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-lib-modules\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962698 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-host\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962720 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-kubelet\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962754 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-run-ovn-kubernetes\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962759 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962802 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysctl-d\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962804 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-host\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962826 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qshcg\" (UniqueName: \"kubernetes.io/projected/e3bb98dc-f964-4937-9c95-4899ca412b4a-kube-api-access-qshcg\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962829 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-cni-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962850 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-modprobe-d\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962857 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-kubelet\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962875 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-cni-bin\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962906 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-systemd\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.962915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962921 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-cni-bin\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962972 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-modprobe-d\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962980 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysctl-d\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.962809 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-lib-modules\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963043 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bdbe70b1-4e89-498e-9026-780d05ec3886-iptables-alerter-script\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovnkube-script-lib\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963086 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-kubernetes\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963142 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-kubernetes\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963166 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-sys\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963195 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-cnibin\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963199 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-sys\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963226 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-os-release\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963252 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-conf-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963256 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-cnibin\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963281 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2m7r\" (UniqueName: \"kubernetes.io/projected/9cd06c0c-27bc-443c-a928-76a45a2b2514-kube-api-access-h2m7r\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963310 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-system-cni-dir\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963314 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-os-release\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963343 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-ovn\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.963566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963372 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963387 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-ovn\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963314 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-conf-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963401 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a974b412-8de8-4e89-965b-a82f6e82ccf8-konnectivity-ca\") pod \"konnectivity-agent-jqh94\" (UID: \"a974b412-8de8-4e89-965b-a82f6e82ccf8\") " pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963443 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vgvtf\" (UniqueName: \"kubernetes.io/projected/bdbe70b1-4e89-498e-9026-780d05ec3886-kube-api-access-vgvtf\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963471 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-cnibin\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963479 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963495 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963528 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88f5f\" (UID: 
\"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963552 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-registration-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963580 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963618 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovn-node-metrics-cert\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-run-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963707 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovnkube-script-lib\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963760 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-systemd\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963797 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwlwk\" (UniqueName: \"kubernetes.io/projected/ff78599b-b18f-47bf-83ba-ffa70116ffdd-kube-api-access-lwlwk\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963825 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-systemd\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.964549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963831 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-socket-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963858 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-device-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963886 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a974b412-8de8-4e89-965b-a82f6e82ccf8-konnectivity-ca\") pod \"konnectivity-agent-jqh94\" (UID: \"a974b412-8de8-4e89-965b-a82f6e82ccf8\") " pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963941 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.963992 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d838269-db57-4db4-b003-dabe835d0054-tmp\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964025 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2pzh\" (UniqueName: \"kubernetes.io/projected/0e814615-0b94-4de9-9bfd-1b36c817910a-kube-api-access-b2pzh\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-multus-certs\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cd06c0c-27bc-443c-a928-76a45a2b2514-host\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9cd06c0c-27bc-443c-a928-76a45a2b2514-serviceca\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964131 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcgbh\" (UniqueName: \"kubernetes.io/projected/425eadc2-ce6c-4aeb-9856-41d3b15c076b-kube-api-access-rcgbh\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964136 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-multus-certs\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964157 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-systemd-units\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964182 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovnkube-config\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964209 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e814615-0b94-4de9-9bfd-1b36c817910a-hosts-file\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964238 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879td\" (UniqueName: \"kubernetes.io/projected/71044697-c141-4bb8-a13d-2e24d233501f-kube-api-access-879td\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964279 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-slash\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964305 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-etc-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964331 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e814615-0b94-4de9-9bfd-1b36c817910a-tmp-dir\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.965447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964338 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-systemd-units\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964361 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-socket-dir-parent\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964397 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-netns\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964429 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-cni-binary-copy\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964439 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-socket-dir-parent\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964457 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-sys-fs\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964482 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-netns\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964485 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e814615-0b94-4de9-9bfd-1b36c817910a-hosts-file\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964521 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgfff\" (UniqueName: \"kubernetes.io/projected/6d838269-db57-4db4-b003-dabe835d0054-kube-api-access-sgfff\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964537 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-slash\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964548 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-system-cni-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964576 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-cni-multus\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964601 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e814615-0b94-4de9-9bfd-1b36c817910a-tmp-dir\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964588 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-etc-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964648 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bdbe70b1-4e89-498e-9026-780d05ec3886-iptables-alerter-script\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964666 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-system-cni-dir\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.966303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965013 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppj8m\" (UniqueName: \"kubernetes.io/projected/0f955662-1ea1-4f86-a110-4d2d78f023c2-kube-api-access-ppj8m\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 
14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965043 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-log-socket\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.964707 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-var-lib-cni-multus\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965079 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysctl-conf\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965098 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovnkube-config\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965108 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-etc-kubernetes\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966913 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:25:21.965118 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-log-socket\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965134 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-cni-bin\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965158 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-cni-netd\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965167 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-cni-bin\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965160 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-etc-kubernetes\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965185 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-var-lib-kubelet\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965198 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-cni-netd\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965214 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff78599b-b18f-47bf-83ba-ffa70116ffdd-cni-binary-copy\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965189 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysctl-conf\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965241 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-env-overrides\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965293 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-var-lib-kubelet\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965330 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-k8s-cni-cncf-io\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.966913 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965356 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-hostroot\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965383 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-os-release\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965410 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-node-log\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965435 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-run-netns\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965444 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-hostroot\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965458 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-var-lib-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965489 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff78599b-b18f-47bf-83ba-ffa70116ffdd-host-run-k8s-cni-cncf-io\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965481 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysconfig\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965525 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bdbe70b1-4e89-498e-9026-780d05ec3886-host-slash\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965534 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-node-log\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965566 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-var-lib-openvswitch\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965579 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-run-netns\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965580 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965607 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3bb98dc-f964-4937-9c95-4899ca412b4a-env-overrides\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965629 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-etc-sysconfig\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965666 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bdbe70b1-4e89-498e-9026-780d05ec3886-host-slash\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.967517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965705 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 
14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965754 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-kubelet\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965756 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff78599b-b18f-47bf-83ba-ffa70116ffdd-cni-binary-copy\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965774 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a974b412-8de8-4e89-965b-a82f6e82ccf8-agent-certs\") pod \"konnectivity-agent-jqh94\" (UID: \"a974b412-8de8-4e89-965b-a82f6e82ccf8\") " pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965789 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-run\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965814 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d838269-db57-4db4-b003-dabe835d0054-etc-tuned\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 
14:25:21.965826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3bb98dc-f964-4937-9c95-4899ca412b4a-host-kubelet\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965837 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-daemon-config\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.965855 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d838269-db57-4db4-b003-dabe835d0054-run\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.966237 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff78599b-b18f-47bf-83ba-ffa70116ffdd-multus-daemon-config\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.967450 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d838269-db57-4db4-b003-dabe835d0054-tmp\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.967589 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3bb98dc-f964-4937-9c95-4899ca412b4a-ovn-node-metrics-cert\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.968243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.968159 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d838269-db57-4db4-b003-dabe835d0054-etc-tuned\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.968648 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.968326 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a974b412-8de8-4e89-965b-a82f6e82ccf8-agent-certs\") pod \"konnectivity-agent-jqh94\" (UID: \"a974b412-8de8-4e89-965b-a82f6e82ccf8\") " pod="kube-system/konnectivity-agent-jqh94" Apr 21 14:25:21.973554 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.973527 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshcg\" (UniqueName: \"kubernetes.io/projected/e3bb98dc-f964-4937-9c95-4899ca412b4a-kube-api-access-qshcg\") pod \"ovnkube-node-pcn9k\" (UID: \"e3bb98dc-f964-4937-9c95-4899ca412b4a\") " pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:25:21.973876 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.973844 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgfff\" (UniqueName: \"kubernetes.io/projected/6d838269-db57-4db4-b003-dabe835d0054-kube-api-access-sgfff\") pod \"tuned-dgzzz\" (UID: \"6d838269-db57-4db4-b003-dabe835d0054\") " pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" Apr 21 14:25:21.973963 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.973885 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vgvtf\" (UniqueName: \"kubernetes.io/projected/bdbe70b1-4e89-498e-9026-780d05ec3886-kube-api-access-vgvtf\") pod \"iptables-alerter-m5mcs\" (UID: \"bdbe70b1-4e89-498e-9026-780d05ec3886\") " pod="openshift-network-operator/iptables-alerter-m5mcs" Apr 21 14:25:21.974029 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.974003 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2pzh\" (UniqueName: \"kubernetes.io/projected/0e814615-0b94-4de9-9bfd-1b36c817910a-kube-api-access-b2pzh\") pod \"node-resolver-fr8hk\" (UID: \"0e814615-0b94-4de9-9bfd-1b36c817910a\") " pod="openshift-dns/node-resolver-fr8hk" Apr 21 14:25:21.974121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.974102 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwlwk\" (UniqueName: \"kubernetes.io/projected/ff78599b-b18f-47bf-83ba-ffa70116ffdd-kube-api-access-lwlwk\") pod \"multus-fg4r9\" (UID: \"ff78599b-b18f-47bf-83ba-ffa70116ffdd\") " pod="openshift-multus/multus-fg4r9" Apr 21 14:25:21.980537 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:21.980518 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:22.066740 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066702 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2m7r\" (UniqueName: \"kubernetes.io/projected/9cd06c0c-27bc-443c-a928-76a45a2b2514-kube-api-access-h2m7r\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9" Apr 21 14:25:22.066877 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-system-cni-dir\") pod \"multus-additional-cni-plugins-88f5f\" (UID: 
\"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.066877 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066784 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-cnibin\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.066877 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066802 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.066877 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066827 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.066877 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066849 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-registration-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066878 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-socket-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066883 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-system-cni-dir\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066903 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-device-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066927 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-cnibin\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066941 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cd06c0c-27bc-443c-a928-76a45a2b2514-host\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066958 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-registration-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.066967 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9cd06c0c-27bc-443c-a928-76a45a2b2514-serviceca\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067005 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-device-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067009 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcgbh\" (UniqueName: \"kubernetes.io/projected/425eadc2-ce6c-4aeb-9856-41d3b15c076b-kube-api-access-rcgbh\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067043 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cd06c0c-27bc-443c-a928-76a45a2b2514-host\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421
14:25:22.067045 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-879td\" (UniqueName: \"kubernetes.io/projected/71044697-c141-4bb8-a13d-2e24d233501f-kube-api-access-879td\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067063 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-socket-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067080 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-cni-binary-copy\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-sys-fs\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067135 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067152 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067160 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppj8m\" (UniqueName: \"kubernetes.io/projected/0f955662-1ea1-4f86-a110-4d2d78f023c2-kube-api-access-ppj8m\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067193 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-os-release\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067222 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067247 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067272 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067312 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067382 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-sys-fs\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067412 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067455 2583
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71044697-c141-4bb8-a13d-2e24d233501f-os-release\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067480 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067486 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f955662-1ea1-4f86-a110-4d2d78f023c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.067551 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.067628 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:25:22.56759727 +0000 UTC m=+2.116823705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067844 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9cd06c0c-27bc-443c-a928-76a45a2b2514-serviceca\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9"
Apr 21 14:25:22.067930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067847 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.068364 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.067958 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71044697-c141-4bb8-a13d-2e24d233501f-cni-binary-copy\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.079015 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.078994 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:22.079015 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.079014 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:22.079175 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.079024 2583 projected.go:194] Error preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:22.079175 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.079084 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. No retries permitted until 2026-04-21 14:25:22.579070391 +0000 UTC m=+2.128296790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:22.081402 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.081363 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2m7r\" (UniqueName: \"kubernetes.io/projected/9cd06c0c-27bc-443c-a928-76a45a2b2514-kube-api-access-h2m7r\") pod \"node-ca-2gtk9\" (UID: \"9cd06c0c-27bc-443c-a928-76a45a2b2514\") " pod="openshift-image-registry/node-ca-2gtk9"
Apr 21 14:25:22.082388 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.082365 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcgbh\" (UniqueName: \"kubernetes.io/projected/425eadc2-ce6c-4aeb-9856-41d3b15c076b-kube-api-access-rcgbh\") pod \"network-metrics-daemon-bcph6\" (UID:
\"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:22.082866 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.082843 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppj8m\" (UniqueName: \"kubernetes.io/projected/0f955662-1ea1-4f86-a110-4d2d78f023c2-kube-api-access-ppj8m\") pod \"aws-ebs-csi-driver-node-w54nj\" (UID: \"0f955662-1ea1-4f86-a110-4d2d78f023c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.083199 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.083180 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-879td\" (UniqueName: \"kubernetes.io/projected/71044697-c141-4bb8-a13d-2e24d233501f-kube-api-access-879td\") pod \"multus-additional-cni-plugins-88f5f\" (UID: \"71044697-c141-4bb8-a13d-2e24d233501f\") " pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.092368 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.092307 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f5719e4ae571eebc033266ca01f65bf.slice/crio-cb87cfe8b0ff63e57b5e50921d0be4aa2bddb619aeb15ef3d4ae8721892b9864 WatchSource:0}: Error finding container cb87cfe8b0ff63e57b5e50921d0be4aa2bddb619aeb15ef3d4ae8721892b9864: Status 404 returned error can't find the container with id cb87cfe8b0ff63e57b5e50921d0be4aa2bddb619aeb15ef3d4ae8721892b9864
Apr 21 14:25:22.092838 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.092741 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod209099c90a25d9687e552a56330f77cb.slice/crio-54288e0f4f00e321029f3aa84e90a62a05c9451569e08cb5d90bc014feab908e WatchSource:0}: Error finding container 54288e0f4f00e321029f3aa84e90a62a05c9451569e08cb5d90bc014feab908e: Status 404 returned error can't find the container with id 54288e0f4f00e321029f3aa84e90a62a05c9451569e08cb5d90bc014feab908e
Apr 21 14:25:22.097947 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.097932 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 14:25:22.176755 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.176662 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dgzzz"
Apr 21 14:25:22.182456 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.182433 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d838269_db57_4db4_b003_dabe835d0054.slice/crio-b67e34f0f408ce0762cb2e0048a9fb26f13f7801eb298660cfc4dc4cd0eb7c28 WatchSource:0}: Error finding container b67e34f0f408ce0762cb2e0048a9fb26f13f7801eb298660cfc4dc4cd0eb7c28: Status 404 returned error can't find the container with id b67e34f0f408ce0762cb2e0048a9fb26f13f7801eb298660cfc4dc4cd0eb7c28
Apr 21 14:25:22.187185 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.187167 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fr8hk"
Apr 21 14:25:22.193129 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.193095 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e814615_0b94_4de9_9bfd_1b36c817910a.slice/crio-56c398707e9fadcde2d1c05865f454fc22d02c74282d889d6b68cb38254611b5 WatchSource:0}: Error finding container 56c398707e9fadcde2d1c05865f454fc22d02c74282d889d6b68cb38254611b5: Status 404 returned error can't find the container with id 56c398707e9fadcde2d1c05865f454fc22d02c74282d889d6b68cb38254611b5
Apr 21 14:25:22.201444 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.201426 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fg4r9"
Apr 21 14:25:22.207265 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.207239 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff78599b_b18f_47bf_83ba_ffa70116ffdd.slice/crio-1243d7b5a5b149c741d760182a07f9e9a2b77b394dce47fb80b89b8ff677bef5 WatchSource:0}: Error finding container 1243d7b5a5b149c741d760182a07f9e9a2b77b394dce47fb80b89b8ff677bef5: Status 404 returned error can't find the container with id 1243d7b5a5b149c741d760182a07f9e9a2b77b394dce47fb80b89b8ff677bef5
Apr 21 14:25:22.220369 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.220340 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m5mcs"
Apr 21 14:25:22.226328 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.226299 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbe70b1_4e89_498e_9026_780d05ec3886.slice/crio-197e2f67ce5125f14fd7421af4a6a8289ce043e1fd3f9fa8dcab76f7eb211f27 WatchSource:0}: Error finding container 197e2f67ce5125f14fd7421af4a6a8289ce043e1fd3f9fa8dcab76f7eb211f27: Status 404 returned error can't find the container with id 197e2f67ce5125f14fd7421af4a6a8289ce043e1fd3f9fa8dcab76f7eb211f27
Apr 21 14:25:22.237631 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.237608 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k"
Apr 21 14:25:22.243292 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.243266 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bb98dc_f964_4937_9c95_4899ca412b4a.slice/crio-154e3ec34bacd1d801d7278fdf13b3555361fae347265e637a84f27bf4bac9df WatchSource:0}: Error finding container 154e3ec34bacd1d801d7278fdf13b3555361fae347265e637a84f27bf4bac9df: Status 404 returned error can't find the container with id 154e3ec34bacd1d801d7278fdf13b3555361fae347265e637a84f27bf4bac9df
Apr 21 14:25:22.261999 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.261973 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jqh94"
Apr 21 14:25:22.268324 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.268299 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda974b412_8de8_4e89_965b_a82f6e82ccf8.slice/crio-d8c94fdd62ac322637a76247ce5b37c0ebed9eee17280fad579bc3dfa9df9dc6 WatchSource:0}: Error finding container d8c94fdd62ac322637a76247ce5b37c0ebed9eee17280fad579bc3dfa9df9dc6: Status 404 returned error can't find the container with id d8c94fdd62ac322637a76247ce5b37c0ebed9eee17280fad579bc3dfa9df9dc6
Apr 21 14:25:22.286919 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.286895 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj"
Apr 21 14:25:22.293383 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.293352 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f955662_1ea1_4f86_a110_4d2d78f023c2.slice/crio-66e565f2462546bac78972989124275ba5e119e118e9aa36bbf8754a00956b4a WatchSource:0}: Error finding container 66e565f2462546bac78972989124275ba5e119e118e9aa36bbf8754a00956b4a: Status 404 returned error can't find the container with id 66e565f2462546bac78972989124275ba5e119e118e9aa36bbf8754a00956b4a
Apr 21 14:25:22.303676 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.303654 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2gtk9"
Apr 21 14:25:22.309302 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.309272 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-88f5f"
Apr 21 14:25:22.316236 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:25:22.316212 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71044697_c141_4bb8_a13d_2e24d233501f.slice/crio-3d9c905a277970bb4beb4bb9d8e50a7e706421cfd9dd22624dcbbffb9fa8cdf3 WatchSource:0}: Error finding container 3d9c905a277970bb4beb4bb9d8e50a7e706421cfd9dd22624dcbbffb9fa8cdf3: Status 404 returned error can't find the container with id 3d9c905a277970bb4beb4bb9d8e50a7e706421cfd9dd22624dcbbffb9fa8cdf3
Apr 21 14:25:22.572047 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.571929 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:22.572047 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.572026 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:22.572234 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.572087 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:25:23.572071833 +0000 UTC m=+3.121298232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:22.673084 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.673032 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:22.673279 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.673225 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:22.673279 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.673252 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:22.673279 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.673266 2583
projected.go:194] Error preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:22.673451 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:22.673337 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. No retries permitted until 2026-04-21 14:25:23.673313713 +0000 UTC m=+3.222540132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:22.914562 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.914464 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:20:21 +0000 UTC" deadline="2027-10-17 19:22:35.732344013 +0000 UTC"
Apr 21 14:25:22.914562 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.914510 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13060h57m12.817838341s"
Apr 21 14:25:22.930666 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.930603 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" event={"ID":"6d838269-db57-4db4-b003-dabe835d0054","Type":"ContainerStarted","Data":"b67e34f0f408ce0762cb2e0048a9fb26f13f7801eb298660cfc4dc4cd0eb7c28"}
Apr 21 14:25:22.945040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.932230 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal" event={"ID":"1f5719e4ae571eebc033266ca01f65bf","Type":"ContainerStarted","Data":"cb87cfe8b0ff63e57b5e50921d0be4aa2bddb619aeb15ef3d4ae8721892b9864"}
Apr 21 14:25:22.945040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.933870 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerStarted","Data":"3d9c905a277970bb4beb4bb9d8e50a7e706421cfd9dd22624dcbbffb9fa8cdf3"}
Apr 21 14:25:22.945040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.936005 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2gtk9" event={"ID":"9cd06c0c-27bc-443c-a928-76a45a2b2514","Type":"ContainerStarted","Data":"614d1f78acc28684860e2842c4137d046c7b43e2d8fd1fca21b36669bfc0d5d4"}
Apr 21 14:25:22.945040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.937389 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" event={"ID":"0f955662-1ea1-4f86-a110-4d2d78f023c2","Type":"ContainerStarted","Data":"66e565f2462546bac78972989124275ba5e119e118e9aa36bbf8754a00956b4a"}
Apr 21 14:25:22.945040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.940756 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jqh94" event={"ID":"a974b412-8de8-4e89-965b-a82f6e82ccf8","Type":"ContainerStarted","Data":"d8c94fdd62ac322637a76247ce5b37c0ebed9eee17280fad579bc3dfa9df9dc6"}
Apr 21 14:25:22.945040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.944055 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"154e3ec34bacd1d801d7278fdf13b3555361fae347265e637a84f27bf4bac9df"}
Apr 21 14:25:22.956401 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.956363 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m5mcs" event={"ID":"bdbe70b1-4e89-498e-9026-780d05ec3886","Type":"ContainerStarted","Data":"197e2f67ce5125f14fd7421af4a6a8289ce043e1fd3f9fa8dcab76f7eb211f27"}
Apr 21 14:25:22.960692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.960656 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal" event={"ID":"209099c90a25d9687e552a56330f77cb","Type":"ContainerStarted","Data":"54288e0f4f00e321029f3aa84e90a62a05c9451569e08cb5d90bc014feab908e"}
Apr 21 14:25:22.967022 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.966985 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fg4r9" event={"ID":"ff78599b-b18f-47bf-83ba-ffa70116ffdd","Type":"ContainerStarted","Data":"1243d7b5a5b149c741d760182a07f9e9a2b77b394dce47fb80b89b8ff677bef5"}
Apr 21 14:25:22.974194 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:22.974147 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fr8hk" event={"ID":"0e814615-0b94-4de9-9bfd-1b36c817910a","Type":"ContainerStarted","Data":"56c398707e9fadcde2d1c05865f454fc22d02c74282d889d6b68cb38254611b5"}
Apr 21 14:25:23.259951 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.259873 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:25:23.580180 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.579073 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod
\"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:23.580180 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.579244 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:23.580180 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.579305 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:25:25.579287451 +0000 UTC m=+5.128513855 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:23.680501 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.680293 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:23.680501 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.680506 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:23.680966 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.680525 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:23.680966 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.680537 2583 projected.go:194] Error preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:23.680966 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.680594 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. No retries permitted until 2026-04-21 14:25:25.680575329 +0000 UTC m=+5.229801729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:23.916441 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.916316 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:20:21 +0000 UTC" deadline="2028-01-26 04:03:12.002118284 +0000 UTC"
Apr 21 14:25:23.916441 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.916356 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15469h37m48.0857661s"
Apr 21 14:25:23.924145 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.924109 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:23.924327 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.924243 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:23.924739 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:23.924708 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:23.924873 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:23.924848 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:24.070090 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:24.070051 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:25:25.597313 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:25.597262 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:25.597949 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.597468 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:25.597949 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.597529 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:25:29.597510279 +0000 UTC m=+9.146736681 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:25.698354 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:25.698315 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:25.698528 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.698505 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:25:25.698674 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.698533 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:25:25.698674 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.698548 2583 projected.go:194] Error preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:25:25.698674 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.698616 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. 
No retries permitted until 2026-04-21 14:25:29.698597635 +0000 UTC m=+9.247824043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:25:25.924408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:25.924327 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:25.924408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:25.924362 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:25.924578 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.924474 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:25.924656 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:25.924602 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:27.924703 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:27.924283 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:27.924703 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:27.924411 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:27.924703 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:27.924548 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:27.924703 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:27.924645 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:29.631142 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:29.631100 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:29.631548 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.631273 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:29.631548 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.631357 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:25:37.631335767 +0000 UTC m=+17.180562170 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:29.731899 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:29.731852 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:29.732071 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.732040 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:25:29.732071 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.732060 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:25:29.732199 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.732073 2583 projected.go:194] Error preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:25:29.732199 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.732131 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. 
No retries permitted until 2026-04-21 14:25:37.73211287 +0000 UTC m=+17.281339272 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:25:29.924357 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:29.924272 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:29.924357 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:29.924339 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:29.924570 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.924460 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:29.924624 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:29.924592 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:31.923744 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:31.923701 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:31.924269 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:31.923768 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:31.924269 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:31.923855 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:31.924269 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:31.924022 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:33.923772 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:33.923714 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:33.923772 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:33.923768 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:33.924219 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:33.923849 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:33.924219 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:33.923971 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:35.924281 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:35.924249 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:35.924281 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:35.924266 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:35.924764 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:35.924357 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:35.924764 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:35.924504 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:36.123761 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.123658 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-66rqj"] Apr 21 14:25:36.127973 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.127947 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.128133 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:36.128032 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4" Apr 21 14:25:36.177852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.177805 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fddef6cc-f623-494d-af92-4fd8811401a4-kubelet-config\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.178048 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.177864 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fddef6cc-f623-494d-af92-4fd8811401a4-dbus\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.178048 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.177991 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.279297 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.279262 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.279478 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.279323 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/fddef6cc-f623-494d-af92-4fd8811401a4-kubelet-config\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.279478 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.279353 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fddef6cc-f623-494d-af92-4fd8811401a4-dbus\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.279478 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:36.279440 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:36.279478 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.279450 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fddef6cc-f623-494d-af92-4fd8811401a4-kubelet-config\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.279698 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:36.279516 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret podName:fddef6cc-f623-494d-af92-4fd8811401a4 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:36.77949529 +0000 UTC m=+16.328721701 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret") pod "global-pull-secret-syncer-66rqj" (UID: "fddef6cc-f623-494d-af92-4fd8811401a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:36.279698 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.279535 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fddef6cc-f623-494d-af92-4fd8811401a4-dbus\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.782222 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:36.782178 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:36.782420 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:36.782289 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:36.782420 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:36.782365 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret podName:fddef6cc-f623-494d-af92-4fd8811401a4 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:37.782345502 +0000 UTC m=+17.331571902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret") pod "global-pull-secret-syncer-66rqj" (UID: "fddef6cc-f623-494d-af92-4fd8811401a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:37.687713 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:37.687672 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:37.688215 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.687859 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:37.688215 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.687940 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.687919191 +0000 UTC m=+33.237145609 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:25:37.788776 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:37.788744 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:37.788776 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:37.788783 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:37.789019 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.788910 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:37.789019 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.788928 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:25:37.789019 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.788950 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:25:37.789019 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.788964 2583 projected.go:194] Error 
preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:25:37.789019 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.788969 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret podName:fddef6cc-f623-494d-af92-4fd8811401a4 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:39.788950171 +0000 UTC m=+19.338176570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret") pod "global-pull-secret-syncer-66rqj" (UID: "fddef6cc-f623-494d-af92-4fd8811401a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:37.789019 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.789011 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.788997971 +0000 UTC m=+33.338224371 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:25:37.924255 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:37.924218 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:37.924461 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:37.924218 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:37.924461 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.924342 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:37.924461 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.924448 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:37.924631 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:37.924218 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:37.924631 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:37.924561 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:39.801126 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:39.801075 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:39.801608 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:39.801196 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 14:25:39.801608 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:39.801247 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret podName:fddef6cc-f623-494d-af92-4fd8811401a4 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:43.801234589 +0000 UTC m=+23.350460987 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret") pod "global-pull-secret-syncer-66rqj" (UID: "fddef6cc-f623-494d-af92-4fd8811401a4") : object "kube-system"/"original-pull-secret" not registered
Apr 21 14:25:39.924196 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:39.924162 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:39.924383 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:39.924161 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:39.924383 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:39.924287 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:39.924383 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:39.924367 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:39.924515 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:39.924161 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:39.924565 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:39.924518 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:41.021331 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.020757 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fg4r9" event={"ID":"ff78599b-b18f-47bf-83ba-ffa70116ffdd","Type":"ContainerStarted","Data":"c4f428bcd5cbcdc6d605e5e3c7b85d72b19b1de8935ca873372151b371d635c7"}
Apr 21 14:25:41.023017 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.022988 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" event={"ID":"6d838269-db57-4db4-b003-dabe835d0054","Type":"ContainerStarted","Data":"80dfe7d899e8f4069bd2a81169aa1af15807517ea8f72b20db5df278a7e43af3"}
Apr 21 14:25:41.026082 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.026053 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal" event={"ID":"1f5719e4ae571eebc033266ca01f65bf","Type":"ContainerStarted","Data":"c5982dc7c83ecc84aaffb4c20090eb554a0a83cc708fe9661d215e7ddb2d9425"}
Apr 21 14:25:41.030556 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030537 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log"
Apr 21 14:25:41.030853 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030834 2583 generic.go:358] "Generic (PLEG): container finished" podID="e3bb98dc-f964-4937-9c95-4899ca412b4a" containerID="54b5af9cbb60484a8fd01a231f508833de7a6d287b9575f8eb9cfca8e17e47a5" exitCode=1
Apr 21 14:25:41.030922 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030870 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"1d51211fc95a7fa2cf70014e55b42df8bbf5c44f219cac8224992eb6a4331667"}
Apr 21 14:25:41.030922 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030891 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"b92046c4fc8f44a4caf6d7220a646472425ff6aade58c2959e76ef6307d51493"}
Apr 21 14:25:41.030922 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030903 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"a0728e3db7505f304d46178ac751534562c24bd1f8c6a369b568d8c4030c60b2"}
Apr 21 14:25:41.030922 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030913 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"43d8ecdeeb787d55dbc824860e1e305c2133533d5e323e63299866cbe2cda47c"}
Apr 21 14:25:41.030922 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030921 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerDied","Data":"54b5af9cbb60484a8fd01a231f508833de7a6d287b9575f8eb9cfca8e17e47a5"}
Apr 21 14:25:41.031071 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.030930 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"7ddf8ffca54f7fb5c90a702cc46979efacaaf71ec99e6328ab264775b798c7f7"}
Apr 21 14:25:41.037205 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.037146 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fg4r9" podStartSLOduration=1.797056076 podStartE2EDuration="20.037130976s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.208913224 +0000 UTC m=+1.758139627" lastFinishedPulling="2026-04-21 14:25:40.448988113 +0000 UTC m=+19.998214527" observedRunningTime="2026-04-21 14:25:41.03593719 +0000 UTC m=+20.585163624" watchObservedRunningTime="2026-04-21 14:25:41.037130976 +0000 UTC m=+20.586357398"
Apr 21 14:25:41.070381 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.070334 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dgzzz" podStartSLOduration=1.8192739850000001 podStartE2EDuration="20.070318452s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.183906378 +0000 UTC m=+1.733132777" lastFinishedPulling="2026-04-21 14:25:40.434950846 +0000 UTC m=+19.984177244" observedRunningTime="2026-04-21 14:25:41.052157497 +0000 UTC m=+20.601383917" watchObservedRunningTime="2026-04-21 14:25:41.070318452 +0000 UTC m=+20.619544872"
Apr 21 14:25:41.923618 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.923574 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:41.923816 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.923574 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:41.923816 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:41.923710 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:41.923816 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:41.923574 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:41.923983 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:41.923813 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:41.923983 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:41.923881 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:42.034916 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.034876 2583 generic.go:358] "Generic (PLEG): container finished" podID="71044697-c141-4bb8-a13d-2e24d233501f" containerID="851fb8c26fe169d69048d24b2643115e9e30d24d599dd23792a102af9984a079" exitCode=0
Apr 21 14:25:42.035322 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.034966 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerDied","Data":"851fb8c26fe169d69048d24b2643115e9e30d24d599dd23792a102af9984a079"}
Apr 21 14:25:42.036461 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.036363 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2gtk9" event={"ID":"9cd06c0c-27bc-443c-a928-76a45a2b2514","Type":"ContainerStarted","Data":"b0dbe34ac3e271dd15c35329126b20df65b1907c6d821181e35d88fd4b11dd6c"}
Apr 21 14:25:42.038028 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.037947 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" event={"ID":"0f955662-1ea1-4f86-a110-4d2d78f023c2","Type":"ContainerStarted","Data":"d0d7dd3367ef1d6a83413577b2477d918cfe5beb9b567406f1c2ec8a643d4b0e"}
Apr 21 14:25:42.039381 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.039356 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jqh94" event={"ID":"a974b412-8de8-4e89-965b-a82f6e82ccf8","Type":"ContainerStarted","Data":"75252d77f17e0adb2921f2a2cc7aab58c9708dd6fbd95342e846e515afab273e"}
Apr 21 14:25:42.040785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.040758 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m5mcs" event={"ID":"bdbe70b1-4e89-498e-9026-780d05ec3886","Type":"ContainerStarted","Data":"0a1f3727d163900f6a9d3928188e5c4b8dee271da3ebb54286b6077660d1a9c3"}
Apr 21 14:25:42.042289 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.042261 2583 generic.go:358] "Generic (PLEG): container finished" podID="209099c90a25d9687e552a56330f77cb" containerID="c595d4744afb6d9a3bbb13b7f90ff3bfd32ff62a5b30bbcd2847cab4f71085bb" exitCode=0
Apr 21 14:25:42.042385 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.042332 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal" event={"ID":"209099c90a25d9687e552a56330f77cb","Type":"ContainerDied","Data":"c595d4744afb6d9a3bbb13b7f90ff3bfd32ff62a5b30bbcd2847cab4f71085bb"}
Apr 21 14:25:42.042487 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.042467 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"
Apr 21 14:25:42.043577 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.043551 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fr8hk" event={"ID":"0e814615-0b94-4de9-9bfd-1b36c817910a","Type":"ContainerStarted","Data":"25889b4fb29f1b9012956d41b8a311488db391c7906eb8a27ef576bb0056e42b"}
Apr 21 14:25:42.052613 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.052593 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 14:25:42.053426 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.053387 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-93.ec2.internal" podStartSLOduration=21.053372911 podStartE2EDuration="21.053372911s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:25:41.070852435 +0000 UTC m=+20.620078857" watchObservedRunningTime="2026-04-21 14:25:42.053372911 +0000 UTC m=+21.602599334"
Apr 21 14:25:42.053602 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.053586 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal"]
Apr 21 14:25:42.071447 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.071389 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m5mcs" podStartSLOduration=2.860665838 podStartE2EDuration="21.071374083s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.22781326 +0000 UTC m=+1.777039663" lastFinishedPulling="2026-04-21 14:25:40.438521492 +0000 UTC m=+19.987747908" observedRunningTime="2026-04-21 14:25:42.071050615 +0000 UTC m=+21.620277038" watchObservedRunningTime="2026-04-21 14:25:42.071374083 +0000 UTC m=+21.620600508"
Apr 21 14:25:42.084618 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.084576 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jqh94" podStartSLOduration=2.950696348 podStartE2EDuration="21.084563314s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.271230382 +0000 UTC m=+1.820456784" lastFinishedPulling="2026-04-21 14:25:40.405097337 +0000 UTC m=+19.954323750" observedRunningTime="2026-04-21 14:25:42.084176713 +0000 UTC m=+21.633403133" watchObservedRunningTime="2026-04-21 14:25:42.084563314 +0000 UTC m=+21.633789714"
Apr 21 14:25:42.097057 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.096960 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2gtk9" podStartSLOduration=3.002757248 podStartE2EDuration="21.096947086s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.311251649 +0000 UTC m=+1.860478048" lastFinishedPulling="2026-04-21 14:25:40.405441483 +0000 UTC m=+19.954667886" observedRunningTime="2026-04-21 14:25:42.096887948 +0000 UTC m=+21.646114371" watchObservedRunningTime="2026-04-21 14:25:42.096947086 +0000 UTC m=+21.646173506"
Apr 21 14:25:42.123235 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.123185 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fr8hk" podStartSLOduration=2.912961472 podStartE2EDuration="21.12316918s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.194800215 +0000 UTC m=+1.744026614" lastFinishedPulling="2026-04-21 14:25:40.405007923 +0000 UTC m=+19.954234322" observedRunningTime="2026-04-21 14:25:42.122282332 +0000 UTC m=+21.671508753" watchObservedRunningTime="2026-04-21 14:25:42.12316918 +0000 UTC m=+21.672395602"
Apr 21 14:25:42.389959 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.389764 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 14:25:42.930289 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.930176 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T14:25:42.389954351Z","UUID":"4fb456d4-433c-4c79-ac07-25d0421e45a0","Handler":null,"Name":"","Endpoint":""}
Apr 21 14:25:42.933684 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.933654 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 14:25:42.933684 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:42.933684 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 14:25:43.049862 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.049828 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" event={"ID":"0f955662-1ea1-4f86-a110-4d2d78f023c2","Type":"ContainerStarted","Data":"1e5cad95c5414443fe5d3f4929a4e1c4052e9968fd788b22af8bf6c8d2a1dc36"}
Apr 21 14:25:43.051715 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.051689 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal" event={"ID":"209099c90a25d9687e552a56330f77cb","Type":"ContainerStarted","Data":"cd2158df325688cddd970ebfc877df9b20f5bc75cdcca1ed4e4cfba6bf1c163f"}
Apr 21 14:25:43.068826 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.068779 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-93.ec2.internal" podStartSLOduration=1.068763816 podStartE2EDuration="1.068763816s" podCreationTimestamp="2026-04-21 14:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:25:43.068229602 +0000 UTC m=+22.617456024" watchObservedRunningTime="2026-04-21 14:25:43.068763816 +0000 UTC m=+22.617990238"
Apr 21 14:25:43.830911 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.830810 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:43.831085 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:43.830987 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 14:25:43.831085 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:43.831068 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret podName:fddef6cc-f623-494d-af92-4fd8811401a4 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:51.831047822 +0000 UTC m=+31.380274233 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret") pod "global-pull-secret-syncer-66rqj" (UID: "fddef6cc-f623-494d-af92-4fd8811401a4") : object "kube-system"/"original-pull-secret" not registered
Apr 21 14:25:43.923688 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.923654 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:43.923888 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.923655 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:43.923888 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:43.923792 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:43.923888 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:43.923655 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:43.924038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:43.923884 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:43.924038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:43.923917 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:44.055458 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:44.055415 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" event={"ID":"0f955662-1ea1-4f86-a110-4d2d78f023c2","Type":"ContainerStarted","Data":"817fdeeeacef29d7dfc945c5b93405a5ac052c1a1b814a6501c7deb4839403b8"}
Apr 21 14:25:44.058404 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:44.058374 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log"
Apr 21 14:25:44.058908 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:44.058875 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"6f69e4f04c6e2dfa2e169a0dd7b0c2abde0acb176b4dab3f0bef1e8cda02ee1d"}
Apr 21 14:25:44.073874 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:44.073807 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w54nj" podStartSLOduration=1.869395734 podStartE2EDuration="23.073787058s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.294840911 +0000 UTC m=+1.844067315" lastFinishedPulling="2026-04-21 14:25:43.499232239 +0000 UTC m=+23.048458639" observedRunningTime="2026-04-21 14:25:44.073092692 +0000 UTC m=+23.622319114" watchObservedRunningTime="2026-04-21 14:25:44.073787058 +0000 UTC m=+23.623013481"
Apr 21 14:25:44.323274 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:44.323190 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jqh94"
Apr 21 14:25:44.323928 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:44.323895 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jqh94"
Apr 21 14:25:45.061315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:45.061282 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jqh94"
Apr 21 14:25:45.061912 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:45.061891 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jqh94"
Apr 21 14:25:45.923707 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:45.923684 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:45.923834 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:45.923691 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:45.923834 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:45.923691 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:45.923834 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:45.923812 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:45.923979 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:45.923883 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:45.923979 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:45.923954 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:46.068845 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:46.068648 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log"
Apr 21 14:25:46.069286 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:46.069229 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"48a379d8ba357e83860cd774f8cb0bbce5bd55329abb1767f65eaa8a72dfe4c2"}
Apr 21 14:25:46.069736 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:46.069699 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k"
Apr 21 14:25:46.069849 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:46.069744 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k"
Apr 21 14:25:46.069902 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:46.069866 2583 scope.go:117] "RemoveContainer" containerID="54b5af9cbb60484a8fd01a231f508833de7a6d287b9575f8eb9cfca8e17e47a5"
Apr 21 14:25:46.084099 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:46.084078 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k"
Apr 21 14:25:47.073295 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.073262 2583 generic.go:358] "Generic (PLEG): container finished" podID="71044697-c141-4bb8-a13d-2e24d233501f" containerID="1251791a6faa2db7d2056a0bfb070ba26524a0f27d64a4ff08085074ea2548d8" exitCode=0
Apr 21 14:25:47.073743 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.073352 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerDied","Data":"1251791a6faa2db7d2056a0bfb070ba26524a0f27d64a4ff08085074ea2548d8"}
Apr 21 14:25:47.076379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.076358 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log"
Apr 21 14:25:47.076751 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.076704 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" event={"ID":"e3bb98dc-f964-4937-9c95-4899ca412b4a","Type":"ContainerStarted","Data":"2b33464fbc783b59d334660dc4e4f676711b56c75a305d38995aca94af4cc4ab"}
Apr 21 14:25:47.076928 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.076908 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k"
Apr 21 14:25:47.091787 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.091758 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k"
Apr 21 14:25:47.121567 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.121519 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" podStartSLOduration=7.87016866 podStartE2EDuration="26.121504887s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.245012163 +0000 UTC m=+1.794238568" lastFinishedPulling="2026-04-21 14:25:40.496348377 +0000 UTC m=+20.045574795" observedRunningTime="2026-04-21 14:25:47.121048796 +0000 UTC m=+26.670275215" watchObservedRunningTime="2026-04-21 14:25:47.121504887 +0000 UTC m=+26.670731307"
Apr 21 14:25:47.923555 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.923523 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:47.923555 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.923543 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:47.923815 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:47.923523 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:47.923815 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:47.923634 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:47.923815 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:47.923683 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:47.923815 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:47.923776 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:48.172454 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:48.172411 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcph6"]
Apr 21 14:25:48.172991 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:48.172527 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:48.172991 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:48.172626 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:48.175369 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:48.175314 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-66rqj"]
Apr 21 14:25:48.175480 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:48.175392 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:48.175480 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:48.175473 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:48.177624 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:48.177600 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pghpm"]
Apr 21 14:25:48.177710 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:48.177676 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:48.177785 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:48.177766 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a"
Apr 21 14:25:49.082084 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:49.081884 2583 generic.go:358] "Generic (PLEG): container finished" podID="71044697-c141-4bb8-a13d-2e24d233501f" containerID="43257c47c2fa577839f95ffd8c7861e2d51b20438019b79f38bb80a201498179" exitCode=0
Apr 21 14:25:49.082264 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:49.081968 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerDied","Data":"43257c47c2fa577839f95ffd8c7861e2d51b20438019b79f38bb80a201498179"}
Apr 21 14:25:49.924294 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:49.924211 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:49.924294 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:49.924259 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:49.924888 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:49.924322 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4"
Apr 21 14:25:49.924888 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:49.924376 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b"
Apr 21 14:25:49.924888 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:49.924399 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:49.924888 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:49.924457 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:51.087765 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:51.087660 2583 generic.go:358] "Generic (PLEG): container finished" podID="71044697-c141-4bb8-a13d-2e24d233501f" containerID="1f39faccfc45dda845df18ea12b6da177678fefbb6a20b62f60c6322f782991e" exitCode=0 Apr 21 14:25:51.087765 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:51.087706 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerDied","Data":"1f39faccfc45dda845df18ea12b6da177678fefbb6a20b62f60c6322f782991e"} Apr 21 14:25:51.890950 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:51.890902 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:51.891132 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:51.891071 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:51.891196 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:51.891144 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret podName:fddef6cc-f623-494d-af92-4fd8811401a4 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:07.891126758 +0000 UTC m=+47.440353157 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret") pod "global-pull-secret-syncer-66rqj" (UID: "fddef6cc-f623-494d-af92-4fd8811401a4") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:25:51.923390 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:51.923356 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:25:51.923580 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:51.923362 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj" Apr 21 14:25:51.923580 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:51.923485 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pghpm" podUID="098e6bc9-3c71-4b00-84be-72f47d753e5a" Apr 21 14:25:51.923580 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:51.923357 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:25:51.923580 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:51.923547 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-66rqj" podUID="fddef6cc-f623-494d-af92-4fd8811401a4" Apr 21 14:25:51.923851 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:51.923648 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:25:52.316843 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.316811 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-93.ec2.internal" event="NodeReady" Apr 21 14:25:52.317322 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.316961 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 14:25:52.357475 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.357438 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"] Apr 21 14:25:52.376516 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.375868 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f4494879d-mn7p6"] Apr 21 14:25:52.376516 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.376361 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:25:52.379239 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.379212 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 14:25:52.379400 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.379296 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-j7vqv\"" Apr 21 14:25:52.379624 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.379605 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 14:25:52.391029 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.391001 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"] Apr 21 14:25:52.391178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.391035 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-flqfn"] Apr 21 14:25:52.391178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.391163 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.395427 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.395239 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4v5jp\"" Apr 21 14:25:52.397419 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.397308 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 14:25:52.398624 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.398245 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 14:25:52.398894 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.398848 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 14:25:52.402243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.402151 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 14:25:52.413351 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.413327 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f4494879d-mn7p6"] Apr 21 14:25:52.413460 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.413357 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-flqfn"] Apr 21 14:25:52.413503 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.413491 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.416444 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.416426 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 14:25:52.416676 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.416658 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 14:25:52.416676 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.416669 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-88srx\"" Apr 21 14:25:52.490178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.490142 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9pxm9"] Apr 21 14:25:52.498197 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498167 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-installation-pull-secrets\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498361 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498202 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzz5b\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-kube-api-access-zzz5b\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498361 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498232 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:25:52.498361 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498306 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e1883f5-d573-490b-964d-821444217181-tmp-dir\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.498504 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498365 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498504 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498416 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e1883f5-d573-490b-964d-821444217181-config-volume\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.498504 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498460 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw8w\" (UniqueName: \"kubernetes.io/projected/1e1883f5-d573-490b-964d-821444217181-kube-api-access-2gw8w\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.498504 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:25:52.498489 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-image-registry-private-configuration\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498677 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498525 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.498677 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498551 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-registry-certificates\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498677 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498588 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-bound-sa-token\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498677 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498672 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/149199e9-acc2-4844-9b85-9231431c2811-ca-trust-extracted\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498837 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-trusted-ca\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.498837 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.498775 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:25:52.515029 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.514995 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9pxm9"] Apr 21 14:25:52.515201 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.515133 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:25:52.517702 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.517679 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 14:25:52.517865 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.517678 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pqf56\"" Apr 21 14:25:52.517942 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.517879 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 14:25:52.518024 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.518011 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 14:25:52.599806 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.599773 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e1883f5-d573-490b-964d-821444217181-tmp-dir\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.599806 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.599810 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600062 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.599845 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1e1883f5-d573-490b-964d-821444217181-config-volume\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.600062 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.599964 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:25:52.600062 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.599988 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found Apr 21 14:25:52.600062 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600019 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw8w\" (UniqueName: \"kubernetes.io/projected/1e1883f5-d573-490b-964d-821444217181-kube-api-access-2gw8w\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.600254 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600049 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-image-registry-private-configuration\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600254 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.600074 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.100046432 +0000 UTC m=+32.649272853 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found Apr 21 14:25:52.600254 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.600254 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.600189 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:25:52.600254 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600189 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e1883f5-d573-490b-964d-821444217181-tmp-dir\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.600254 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600251 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-registry-certificates\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600524 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.600275 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:25:53.100257986 +0000 UTC m=+32.649484387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found Apr 21 14:25:52.600524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600320 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-bound-sa-token\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600357 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8hw\" (UniqueName: \"kubernetes.io/projected/33175606-6499-415b-b273-193922870d52-kube-api-access-4v8hw\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:25:52.600524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600385 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/149199e9-acc2-4844-9b85-9231431c2811-ca-trust-extracted\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600407 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-trusted-ca\") pod \"image-registry-6f4494879d-mn7p6\" (UID: 
\"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600526 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e1883f5-d573-490b-964d-821444217181-config-volume\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600623 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600711 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-installation-pull-secrets\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600769 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzz5b\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-kube-api-access-zzz5b\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600796 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/149199e9-acc2-4844-9b85-9231431c2811-ca-trust-extracted\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-registry-certificates\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.600852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600800 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"
Apr 21 14:25:52.601192 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.600869 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:52.601192 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.600875 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:25:52.601192 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.600970 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.100953358 +0000 UTC m=+32.650179756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found
Apr 21 14:25:52.601331 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.601279 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-trusted-ca\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.601374 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.601349 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"
Apr 21 14:25:52.605125 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.604986 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-installation-pull-secrets\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.605219 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.605008 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-image-registry-private-configuration\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.608834 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.608813 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw8w\" (UniqueName: \"kubernetes.io/projected/1e1883f5-d573-490b-964d-821444217181-kube-api-access-2gw8w\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn"
Apr 21 14:25:52.609081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.609052 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-bound-sa-token\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.609855 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.609837 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzz5b\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-kube-api-access-zzz5b\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:52.701869 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.701777 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8hw\" (UniqueName: \"kubernetes.io/projected/33175606-6499-415b-b273-193922870d52-kube-api-access-4v8hw\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:52.702045 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.701880 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:52.702045 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.701997 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:25:52.702219 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:52.702053 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:53.202037361 +0000 UTC m=+32.751263768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found
Apr 21 14:25:52.710597 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:52.710568 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8hw\" (UniqueName: \"kubernetes.io/projected/33175606-6499-415b-b273-193922870d52-kube-api-access-4v8hw\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:53.105194 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.105153 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"
Apr 21 14:25:53.105383 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.105213 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:53.105383 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.105260 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn"
Apr 21 14:25:53.105383 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105314 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:25:53.105383 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105374 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:25:53.105383 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105386 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:54.105368071 +0000 UTC m=+33.654594473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found
Apr 21 14:25:53.105622 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105386 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:25:53.105622 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105405 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found
Apr 21 14:25:53.105622 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105422 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:54.105406931 +0000 UTC m=+33.654633329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found
Apr 21 14:25:53.105622 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.105450 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:54.10543294 +0000 UTC m=+33.654659353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found
Apr 21 14:25:53.206616 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.206573 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:53.206817 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.206748 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:25:53.206888 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.206826 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:54.206809659 +0000 UTC m=+33.756036059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found
Apr 21 14:25:53.711957 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.711917 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:53.712430 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.712066 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:53.712430 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.712132 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:26:25.712117484 +0000 UTC m=+65.261343892 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 14:25:53.813073 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.813023 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:53.813297 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.813187 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 14:25:53.813297 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.813205 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 14:25:53.813297 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.813217 2583 projected.go:194] Error preparing data for projected volume kube-api-access-zzh2q for pod openshift-network-diagnostics/network-check-target-pghpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:53.813297 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:53.813271 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q podName:098e6bc9-3c71-4b00-84be-72f47d753e5a nodeName:}" failed. No retries permitted until 2026-04-21 14:26:25.813253949 +0000 UTC m=+65.362480360 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zzh2q" (UniqueName: "kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q") pod "network-check-target-pghpm" (UID: "098e6bc9-3c71-4b00-84be-72f47d753e5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 14:25:53.923502 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.923464 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm"
Apr 21 14:25:53.923672 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.923464 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:25:53.923756 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.923468 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6"
Apr 21 14:25:53.927135 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.927109 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 14:25:53.927135 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.927129 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9wlmf\""
Apr 21 14:25:53.927333 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.927163 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 14:25:53.927333 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.927196 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fd928\""
Apr 21 14:25:53.927333 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.927109 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 14:25:53.927475 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:53.927460 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 14:25:54.116142 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:54.116103 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:54.116346 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:54.116163 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn"
Apr 21 14:25:54.116346 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:54.116219 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"
Apr 21 14:25:54.116346 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116292 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:25:54.116346 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116323 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:25:54.116346 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116317 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found
Apr 21 14:25:54.116605 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116323 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:25:54.116605 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116394 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:56.116372936 +0000 UTC m=+35.665599349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found
Apr 21 14:25:54.116605 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116419 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:56.116402182 +0000 UTC m=+35.665628581 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found
Apr 21 14:25:54.116605 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.116438 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:56.116429087 +0000 UTC m=+35.665655486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found
Apr 21 14:25:54.216954 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:54.216910 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:54.217152 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.217045 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:25:54.217152 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:54.217125 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:25:56.217108073 +0000 UTC m=+35.766334486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found
Apr 21 14:25:56.133378 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:56.133340 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn"
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:56.133445 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133475 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:56.133492 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133555 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:00.133532289 +0000 UTC m=+39.682758689 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133554 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133610 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133628 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133617 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:00.133600819 +0000 UTC m=+39.682827218 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found
Apr 21 14:25:56.133896 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.133692 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:00.133669208 +0000 UTC m=+39.682895623 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found
Apr 21 14:25:56.234294 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:56.234251 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:25:56.234542 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.234411 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:25:56.234542 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:25:56.234488 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:00.23447225 +0000 UTC m=+39.783698672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found
Apr 21 14:25:58.105294 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:58.105258 2583 generic.go:358] "Generic (PLEG): container finished" podID="71044697-c141-4bb8-a13d-2e24d233501f" containerID="d11ab3c7522fdcd3c01a65b0b44b78197134dd05f1143f5914fa2b160e8bb15c" exitCode=0
Apr 21 14:25:58.105801 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:58.105321 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerDied","Data":"d11ab3c7522fdcd3c01a65b0b44b78197134dd05f1143f5914fa2b160e8bb15c"}
Apr 21 14:25:59.110374 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:59.110335 2583 generic.go:358] "Generic (PLEG): container finished" podID="71044697-c141-4bb8-a13d-2e24d233501f" containerID="3d0886d6c1685d9834d282728342aed37894105a49d94ec8f57d311ed2acf5ce" exitCode=0
Apr 21 14:25:59.110785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:25:59.110386 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerDied","Data":"3d0886d6c1685d9834d282728342aed37894105a49d94ec8f57d311ed2acf5ce"}
Apr 21 14:26:00.115081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:00.115047 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88f5f" event={"ID":"71044697-c141-4bb8-a13d-2e24d233501f","Type":"ContainerStarted","Data":"8f675ae5eda3f38003b545d9901009fc555a6556ff3073d330eeffff214a8ee2"}
Apr 21 14:26:00.137418 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:00.137371 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-88f5f" podStartSLOduration=4.438275684 podStartE2EDuration="39.137355683s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:25:22.317544698 +0000 UTC m=+1.866771098" lastFinishedPulling="2026-04-21 14:25:57.016624695 +0000 UTC m=+36.565851097" observedRunningTime="2026-04-21 14:26:00.136397719 +0000 UTC m=+39.685624141" watchObservedRunningTime="2026-04-21 14:26:00.137355683 +0000 UTC m=+39.686582103"
Apr 21 14:26:00.167671 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:00.167630 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn"
Apr 21 14:26:00.167884 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:00.167741 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"
Apr 21 14:26:00.167884 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:00.167779 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:26:00.167884 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167825 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 14:26:00.168038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167877 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 14:26:00.168038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167903 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:08.167882123 +0000 UTC m=+47.717108529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found
Apr 21 14:26:00.168038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167936 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:08.167922885 +0000 UTC m=+47.717149288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found
Apr 21 14:26:00.168038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167879 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 14:26:00.168038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167952 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found
Apr 21 14:26:00.168038 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.167985 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:08.167975286 +0000 UTC m=+47.717201703 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found
Apr 21 14:26:00.268607 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:00.268563 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9"
Apr 21 14:26:00.268848 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.268708 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 14:26:00.268848 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:00.268791 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:08.268775375 +0000 UTC m=+47.818001779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found
Apr 21 14:26:07.926973 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:07.926932 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:26:07.930100 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:07.930073 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fddef6cc-f623-494d-af92-4fd8811401a4-original-pull-secret\") pod \"global-pull-secret-syncer-66rqj\" (UID: \"fddef6cc-f623-494d-af92-4fd8811401a4\") " pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:26:08.043227 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:08.043176 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-66rqj"
Apr 21 14:26:08.170824 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:08.170787 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-66rqj"]
Apr 21 14:26:08.175094 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:26:08.175064 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfddef6cc_f623_494d_af92_4fd8811401a4.slice/crio-0fd4daf988d03befe17163039eae18bb28f17aa8f5211f44a7b521a149c465bd WatchSource:0}: Error finding container 0fd4daf988d03befe17163039eae18bb28f17aa8f5211f44a7b521a149c465bd: Status 404 returned error can't find the container with id 0fd4daf988d03befe17163039eae18bb28f17aa8f5211f44a7b521a149c465bd
Apr 21 14:26:08.229257 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:08.229225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6"
Apr 21 14:26:08.229415 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:08.229274 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn"
Apr 21 14:26:08.229415 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:08.229337 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID:
\"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:26:08.229415 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229382 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:26:08.229415 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229405 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found Apr 21 14:26:08.229534 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229463 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 14:26:08.229534 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229464 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:08.229596 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229522 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:24.229505651 +0000 UTC m=+63.778732050 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found Apr 21 14:26:08.229596 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229561 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:26:24.229545776 +0000 UTC m=+63.778772185 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found Apr 21 14:26:08.229596 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.229577 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:24.229567494 +0000 UTC m=+63.778793892 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found Apr 21 14:26:08.330751 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:08.330696 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:26:08.330964 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.330864 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:08.330964 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:08.330944 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:26:24.330923986 +0000 UTC m=+63.880150401 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found Apr 21 14:26:09.133498 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:09.133455 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-66rqj" event={"ID":"fddef6cc-f623-494d-af92-4fd8811401a4","Type":"ContainerStarted","Data":"0fd4daf988d03befe17163039eae18bb28f17aa8f5211f44a7b521a149c465bd"} Apr 21 14:26:13.142744 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:13.142688 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-66rqj" event={"ID":"fddef6cc-f623-494d-af92-4fd8811401a4","Type":"ContainerStarted","Data":"14573bdd7bb9dff3f0181285cfa20bd109b2a3821be45b430af52dbf61626abb"} Apr 21 14:26:13.163220 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:13.163157 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-66rqj" podStartSLOduration=33.173228573 podStartE2EDuration="37.163143802s" podCreationTimestamp="2026-04-21 14:25:36 +0000 UTC" firstStartedPulling="2026-04-21 14:26:08.176783954 +0000 UTC m=+47.726010353" lastFinishedPulling="2026-04-21 14:26:12.166699174 +0000 UTC m=+51.715925582" observedRunningTime="2026-04-21 14:26:13.162763855 +0000 UTC m=+52.711990273" watchObservedRunningTime="2026-04-21 14:26:13.163143802 +0000 UTC m=+52.712370222" Apr 21 14:26:16.098112 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.098083 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh"] Apr 21 14:26:16.133638 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.133610 2583 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh"] Apr 21 14:26:16.133803 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.133745 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.136265 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.136237 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 14:26:16.136449 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.136266 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 14:26:16.137423 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.137400 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 14:26:16.137525 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.137458 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 14:26:16.194835 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.194800 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6da1a60-6b4c-4412-b979-77b3fb44f43d-tmp\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.194835 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.194835 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq4t\" (UniqueName: 
\"kubernetes.io/projected/b6da1a60-6b4c-4412-b979-77b3fb44f43d-kube-api-access-gvq4t\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.195014 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.194978 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b6da1a60-6b4c-4412-b979-77b3fb44f43d-klusterlet-config\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.296043 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.296009 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b6da1a60-6b4c-4412-b979-77b3fb44f43d-klusterlet-config\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.296212 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.296142 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6da1a60-6b4c-4412-b979-77b3fb44f43d-tmp\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.296212 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.296161 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvq4t\" (UniqueName: \"kubernetes.io/projected/b6da1a60-6b4c-4412-b979-77b3fb44f43d-kube-api-access-gvq4t\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: 
\"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.296533 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.296502 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6da1a60-6b4c-4412-b979-77b3fb44f43d-tmp\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.298507 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.298488 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b6da1a60-6b4c-4412-b979-77b3fb44f43d-klusterlet-config\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.304961 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.304935 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvq4t\" (UniqueName: \"kubernetes.io/projected/b6da1a60-6b4c-4412-b979-77b3fb44f43d-kube-api-access-gvq4t\") pod \"klusterlet-addon-workmgr-78bd496cff-9n9wh\" (UID: \"b6da1a60-6b4c-4412-b979-77b3fb44f43d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.442924 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.442821 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:16.586952 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:16.586919 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh"] Apr 21 14:26:16.590030 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:26:16.590004 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6da1a60_6b4c_4412_b979_77b3fb44f43d.slice/crio-1bbd1ad848e57cd78b9dbabad55682c4ab32591061b4a309c6b545d556848edd WatchSource:0}: Error finding container 1bbd1ad848e57cd78b9dbabad55682c4ab32591061b4a309c6b545d556848edd: Status 404 returned error can't find the container with id 1bbd1ad848e57cd78b9dbabad55682c4ab32591061b4a309c6b545d556848edd Apr 21 14:26:17.150252 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:17.150213 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" event={"ID":"b6da1a60-6b4c-4412-b979-77b3fb44f43d","Type":"ContainerStarted","Data":"1bbd1ad848e57cd78b9dbabad55682c4ab32591061b4a309c6b545d556848edd"} Apr 21 14:26:19.097653 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:19.097622 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pcn9k" Apr 21 14:26:21.160083 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:21.160044 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" event={"ID":"b6da1a60-6b4c-4412-b979-77b3fb44f43d","Type":"ContainerStarted","Data":"78f6f53d2a0d2101f42e1b3fd2bfc5272e7e1a52ee4cf5813c717a78a90aadf5"} Apr 21 14:26:21.160574 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:21.160267 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:21.161895 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:21.161865 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:26:21.176357 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:21.176313 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" podStartSLOduration=1.3813093730000001 podStartE2EDuration="5.176301226s" podCreationTimestamp="2026-04-21 14:26:16 +0000 UTC" firstStartedPulling="2026-04-21 14:26:16.591928164 +0000 UTC m=+56.141154562" lastFinishedPulling="2026-04-21 14:26:20.38692 +0000 UTC m=+59.936146415" observedRunningTime="2026-04-21 14:26:21.174941811 +0000 UTC m=+60.724168231" watchObservedRunningTime="2026-04-21 14:26:21.176301226 +0000 UTC m=+60.725527646" Apr 21 14:26:24.259103 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:24.259066 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:24.259137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:24.259174 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259228 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259277 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259281 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259291 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259298 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:56.259281962 +0000 UTC m=+95.808508362 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259336 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:56.259320363 +0000 UTC m=+95.808546763 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found Apr 21 14:26:24.259567 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.259362 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:56.259351966 +0000 UTC m=+95.808578368 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found Apr 21 14:26:24.360490 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:24.360450 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:26:24.360669 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.360597 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:24.360669 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:24.360665 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:26:56.360649428 +0000 UTC m=+95.909875826 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found Apr 21 14:26:25.771959 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:25.771903 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:26:25.774855 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:25.774835 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:26:25.783167 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:25.783148 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:26:25.783229 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:25.783220 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:27:29.783202507 +0000 UTC m=+129.332428906 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : secret "metrics-daemon-secret" not found Apr 21 14:26:25.873212 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:25.873168 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:26:25.875876 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:25.875854 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:26:25.886176 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:25.886157 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:26:25.897418 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:25.897390 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzh2q\" (UniqueName: \"kubernetes.io/projected/098e6bc9-3c71-4b00-84be-72f47d753e5a-kube-api-access-zzh2q\") pod \"network-check-target-pghpm\" (UID: \"098e6bc9-3c71-4b00-84be-72f47d753e5a\") " pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:26:26.040424 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:26.040348 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fd928\"" Apr 21 14:26:26.048047 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:26.048024 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:26:26.163508 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:26.163477 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pghpm"] Apr 21 14:26:26.171015 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:26.170980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pghpm" event={"ID":"098e6bc9-3c71-4b00-84be-72f47d753e5a","Type":"ContainerStarted","Data":"d7c243fa47bf1475a332cdb072da7fa0a19372b9623e5e95c31f45d651f0f4e5"} Apr 21 14:26:30.180914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:30.180875 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pghpm" event={"ID":"098e6bc9-3c71-4b00-84be-72f47d753e5a","Type":"ContainerStarted","Data":"8be3c7ecae03f25b3a4a6b895f3cc5fe2f6c7eb51438df9127e9dba83b9f13b4"} Apr 21 14:26:30.181319 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:30.180993 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:26:30.196292 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:30.196242 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pghpm" podStartSLOduration=65.953950084 podStartE2EDuration="1m9.196226393s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:26:26.168967733 +0000 UTC m=+65.718194147" lastFinishedPulling="2026-04-21 14:26:29.411244043 +0000 UTC m=+68.960470456" observedRunningTime="2026-04-21 14:26:30.195660692 +0000 UTC m=+69.744887113" watchObservedRunningTime="2026-04-21 14:26:30.196226393 +0000 UTC m=+69.745452813" Apr 21 14:26:56.308919 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:56.308875 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:56.308932 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:56.308969 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309023 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309090 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309096 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309118 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-6f4494879d-mn7p6: secret "image-registry-tls" not found Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309092 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:00.309077351 +0000 UTC m=+159.858303749 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309161 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls podName:1e1883f5-d573-490b-964d-821444217181 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:00.309145238 +0000 UTC m=+159.858371638 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls") pod "dns-default-flqfn" (UID: "1e1883f5-d573-490b-964d-821444217181") : secret "dns-default-metrics-tls" not found Apr 21 14:26:56.309384 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.309180 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls podName:149199e9-acc2-4844-9b85-9231431c2811 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:00.309170287 +0000 UTC m=+159.858396688 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls") pod "image-registry-6f4494879d-mn7p6" (UID: "149199e9-acc2-4844-9b85-9231431c2811") : secret "image-registry-tls" not found Apr 21 14:26:56.409608 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:26:56.409572 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:26:56.409800 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.409763 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:26:56.409849 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:26:56.409828 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:00.409814343 +0000 UTC m=+159.959040741 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found Apr 21 14:27:01.185650 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:01.185613 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pghpm" Apr 21 14:27:29.862165 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:29.862105 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:27:29.862661 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:29.862255 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:27:29.862661 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:29.862334 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs podName:425eadc2-ce6c-4aeb-9856-41d3b15c076b nodeName:}" failed. No retries permitted until 2026-04-21 14:29:31.862316919 +0000 UTC m=+251.411543318 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs") pod "network-metrics-daemon-bcph6" (UID: "425eadc2-ce6c-4aeb-9856-41d3b15c076b") : secret "metrics-daemon-secret" not found Apr 21 14:27:41.275048 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:41.275018 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fr8hk_0e814615-0b94-4de9-9bfd-1b36c817910a/dns-node-resolver/0.log" Apr 21 14:27:41.875553 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:41.875525 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2gtk9_9cd06c0c-27bc-443c-a928-76a45a2b2514/node-ca/0.log" Apr 21 14:27:45.551071 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.551035 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn"] Apr 21 14:27:45.553632 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.553616 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.556135 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.556113 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 14:27:45.556135 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.556120 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fwm66\"" Apr 21 14:27:45.557166 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.557150 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 14:27:45.557215 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.557162 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 14:27:45.557215 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.557175 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:27:45.565333 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.565312 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn"] Apr 21 14:27:45.680904 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.680868 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mnt\" (UniqueName: \"kubernetes.io/projected/43072f22-397f-42e1-a2fd-0d96e71b4412-kube-api-access-92mnt\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.681100 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.680950 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43072f22-397f-42e1-a2fd-0d96e71b4412-config\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.681100 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.680986 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43072f22-397f-42e1-a2fd-0d96e71b4412-serving-cert\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.781997 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.781953 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43072f22-397f-42e1-a2fd-0d96e71b4412-config\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.782173 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.782009 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43072f22-397f-42e1-a2fd-0d96e71b4412-serving-cert\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.782173 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.782154 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92mnt\" (UniqueName: \"kubernetes.io/projected/43072f22-397f-42e1-a2fd-0d96e71b4412-kube-api-access-92mnt\") pod 
\"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.782504 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.782479 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43072f22-397f-42e1-a2fd-0d96e71b4412-config\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.784260 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.784240 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43072f22-397f-42e1-a2fd-0d96e71b4412-serving-cert\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.791076 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.791053 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mnt\" (UniqueName: \"kubernetes.io/projected/43072f22-397f-42e1-a2fd-0d96e71b4412-kube-api-access-92mnt\") pod \"service-ca-operator-d6fc45fc5-24zfn\" (UID: \"43072f22-397f-42e1-a2fd-0d96e71b4412\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.862665 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.862628 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" Apr 21 14:27:45.990021 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:45.989988 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn"] Apr 21 14:27:45.994346 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:27:45.994322 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43072f22_397f_42e1_a2fd_0d96e71b4412.slice/crio-06418cdac9be34fc6db7123f36e731867dd0ceead4a38da04937ce682023c7b4 WatchSource:0}: Error finding container 06418cdac9be34fc6db7123f36e731867dd0ceead4a38da04937ce682023c7b4: Status 404 returned error can't find the container with id 06418cdac9be34fc6db7123f36e731867dd0ceead4a38da04937ce682023c7b4 Apr 21 14:27:46.322095 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.322059 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" event={"ID":"43072f22-397f-42e1-a2fd-0d96e71b4412","Type":"ContainerStarted","Data":"06418cdac9be34fc6db7123f36e731867dd0ceead4a38da04937ce682023c7b4"} Apr 21 14:27:46.657459 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.657374 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg"] Apr 21 14:27:46.660396 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.660369 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" Apr 21 14:27:46.662785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.662762 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 14:27:46.663098 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.663071 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 14:27:46.663709 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.663692 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-4rxsp\"" Apr 21 14:27:46.670957 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.670933 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg"] Apr 21 14:27:46.789996 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.789957 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689bx\" (UniqueName: \"kubernetes.io/projected/9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2-kube-api-access-689bx\") pod \"migrator-74bb7799d9-zdnvg\" (UID: \"9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" Apr 21 14:27:46.890778 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.890737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-689bx\" (UniqueName: \"kubernetes.io/projected/9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2-kube-api-access-689bx\") pod \"migrator-74bb7799d9-zdnvg\" (UID: \"9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" Apr 21 14:27:46.901373 ip-10-0-138-93 kubenswrapper[2583]: 
I0421 14:27:46.901343 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-689bx\" (UniqueName: \"kubernetes.io/projected/9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2-kube-api-access-689bx\") pod \"migrator-74bb7799d9-zdnvg\" (UID: \"9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" Apr 21 14:27:46.973070 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:46.972970 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" Apr 21 14:27:47.132150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:47.132116 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg"] Apr 21 14:27:47.136189 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:27:47.136143 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8477fc_9b40_4b7b_8db7_edfd1d2e6ee2.slice/crio-28f15c74bce84ece7c5464a5b38f6974dc3b248b73a5288aeb7b6d61cfad0250 WatchSource:0}: Error finding container 28f15c74bce84ece7c5464a5b38f6974dc3b248b73a5288aeb7b6d61cfad0250: Status 404 returned error can't find the container with id 28f15c74bce84ece7c5464a5b38f6974dc3b248b73a5288aeb7b6d61cfad0250 Apr 21 14:27:47.325958 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:47.325921 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" event={"ID":"9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2","Type":"ContainerStarted","Data":"28f15c74bce84ece7c5464a5b38f6974dc3b248b73a5288aeb7b6d61cfad0250"} Apr 21 14:27:48.329754 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:48.329698 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" 
event={"ID":"43072f22-397f-42e1-a2fd-0d96e71b4412","Type":"ContainerStarted","Data":"6d9481bf81528bd5f44e09b09d190a2297892b6910345f67a6629c6bac45ff12"} Apr 21 14:27:48.345794 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:48.345720 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" podStartSLOduration=1.347000351 podStartE2EDuration="3.345701784s" podCreationTimestamp="2026-04-21 14:27:45 +0000 UTC" firstStartedPulling="2026-04-21 14:27:45.99658065 +0000 UTC m=+145.545807049" lastFinishedPulling="2026-04-21 14:27:47.995282081 +0000 UTC m=+147.544508482" observedRunningTime="2026-04-21 14:27:48.344072244 +0000 UTC m=+147.893298665" watchObservedRunningTime="2026-04-21 14:27:48.345701784 +0000 UTC m=+147.894928208" Apr 21 14:27:49.334191 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:49.334155 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" event={"ID":"9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2","Type":"ContainerStarted","Data":"f2b20f476068346f51bb1b82b1811991ad0bd91b75cfcc7116588d865913489e"} Apr 21 14:27:49.334191 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:49.334195 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" event={"ID":"9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2","Type":"ContainerStarted","Data":"8cdeb9a3c60fa455b950a4f7a584957c578f2dc1076b6e2dd84674e5d50d7a95"} Apr 21 14:27:49.350240 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:49.350187 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zdnvg" podStartSLOduration=1.742032596 podStartE2EDuration="3.350169786s" podCreationTimestamp="2026-04-21 14:27:46 +0000 UTC" firstStartedPulling="2026-04-21 14:27:47.138306479 +0000 UTC m=+146.687532896" lastFinishedPulling="2026-04-21 
14:27:48.746443686 +0000 UTC m=+148.295670086" observedRunningTime="2026-04-21 14:27:49.349325752 +0000 UTC m=+148.898552170" watchObservedRunningTime="2026-04-21 14:27:49.350169786 +0000 UTC m=+148.899396208" Apr 21 14:27:55.389163 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:55.389117 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" podUID="52f4ef6e-0001-42d5-acda-d8d1b7ce4e20" Apr 21 14:27:55.407272 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:55.407238 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" podUID="149199e9-acc2-4844-9b85-9231431c2811" Apr 21 14:27:55.423588 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:55.423552 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-flqfn" podUID="1e1883f5-d573-490b-964d-821444217181" Apr 21 14:27:55.526684 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:55.526635 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9pxm9" podUID="33175606-6499-415b-b273-193922870d52" Apr 21 14:27:56.353338 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:56.353305 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:27:56.353495 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:27:56.353305 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:27:56.949201 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:27:56.949159 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bcph6" podUID="425eadc2-ce6c-4aeb-9856-41d3b15c076b" Apr 21 14:28:00.402644 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.402607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:28:00.403077 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.402657 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:28:00.403077 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.402689 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:28:00.403077 ip-10-0-138-93 
kubenswrapper[2583]: E0421 14:28:00.402785 2583 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 14:28:00.403077 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:00.402865 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert podName:52f4ef6e-0001-42d5-acda-d8d1b7ce4e20 nodeName:}" failed. No retries permitted until 2026-04-21 14:30:02.402847261 +0000 UTC m=+281.952073661 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bcjt5" (UID: "52f4ef6e-0001-42d5-acda-d8d1b7ce4e20") : secret "networking-console-plugin-cert" not found Apr 21 14:28:00.405097 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.405067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1e1883f5-d573-490b-964d-821444217181-metrics-tls\") pod \"dns-default-flqfn\" (UID: \"1e1883f5-d573-490b-964d-821444217181\") " pod="openshift-dns/dns-default-flqfn" Apr 21 14:28:00.405228 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.405215 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"image-registry-6f4494879d-mn7p6\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:28:00.503784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.503753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" 
(UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:28:00.503934 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:00.503863 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:28:00.503934 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:00.503920 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert podName:33175606-6499-415b-b273-193922870d52 nodeName:}" failed. No retries permitted until 2026-04-21 14:30:02.503906837 +0000 UTC m=+282.053133235 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert") pod "ingress-canary-9pxm9" (UID: "33175606-6499-415b-b273-193922870d52") : secret "canary-serving-cert" not found Apr 21 14:28:00.557143 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.557118 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4v5jp\"" Apr 21 14:28:00.565194 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.565170 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:28:00.678537 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:00.678464 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f4494879d-mn7p6"] Apr 21 14:28:00.681572 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:00.681547 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149199e9_acc2_4844_9b85_9231431c2811.slice/crio-92ab120f0fc0869b2e5f896f07ab59a68387131b2464f5d1862b87b20201becb WatchSource:0}: Error finding container 92ab120f0fc0869b2e5f896f07ab59a68387131b2464f5d1862b87b20201becb: Status 404 returned error can't find the container with id 92ab120f0fc0869b2e5f896f07ab59a68387131b2464f5d1862b87b20201becb Apr 21 14:28:01.366860 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:01.366825 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" event={"ID":"149199e9-acc2-4844-9b85-9231431c2811","Type":"ContainerStarted","Data":"9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30"} Apr 21 14:28:01.366860 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:01.366863 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" event={"ID":"149199e9-acc2-4844-9b85-9231431c2811","Type":"ContainerStarted","Data":"92ab120f0fc0869b2e5f896f07ab59a68387131b2464f5d1862b87b20201becb"} Apr 21 14:28:01.367213 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:01.366951 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:28:01.388246 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:01.388201 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" 
podStartSLOduration=160.388187503 podStartE2EDuration="2m40.388187503s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:28:01.386667639 +0000 UTC m=+160.935894059" watchObservedRunningTime="2026-04-21 14:28:01.388187503 +0000 UTC m=+160.937413924" Apr 21 14:28:08.923642 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:08.923537 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-flqfn" Apr 21 14:28:08.924092 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:08.923676 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:28:08.924532 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:08.924505 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:28:08.928539 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:08.928520 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-88srx\"" Apr 21 14:28:08.934129 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:08.934109 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-flqfn" Apr 21 14:28:09.053102 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:09.053067 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-flqfn"] Apr 21 14:28:09.056492 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:09.056460 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e1883f5_d573_490b_964d_821444217181.slice/crio-37d27da6c98f3fde7c5fa559e992d098c15b4961ae93c7059ccd348224e81414 WatchSource:0}: Error finding container 37d27da6c98f3fde7c5fa559e992d098c15b4961ae93c7059ccd348224e81414: Status 404 returned error can't find the container with id 37d27da6c98f3fde7c5fa559e992d098c15b4961ae93c7059ccd348224e81414 Apr 21 14:28:09.386773 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:09.386710 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flqfn" event={"ID":"1e1883f5-d573-490b-964d-821444217181","Type":"ContainerStarted","Data":"37d27da6c98f3fde7c5fa559e992d098c15b4961ae93c7059ccd348224e81414"} Apr 21 14:28:11.392898 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:11.392857 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flqfn" event={"ID":"1e1883f5-d573-490b-964d-821444217181","Type":"ContainerStarted","Data":"981855db74604e901c35c707b0dc88f37b9a1405a950768952530434c6fbde67"} Apr 21 14:28:11.393289 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:11.392906 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flqfn" event={"ID":"1e1883f5-d573-490b-964d-821444217181","Type":"ContainerStarted","Data":"ac361425520032b76f9c468538c783263cdc6890332078ac1da3e3d4b3fc8f2b"} Apr 21 14:28:11.393289 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:11.392978 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-flqfn" Apr 21 14:28:11.409564 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:11.409513 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-flqfn" podStartSLOduration=138.027087514 podStartE2EDuration="2m19.409496338s" podCreationTimestamp="2026-04-21 14:25:52 +0000 UTC" firstStartedPulling="2026-04-21 14:28:09.058246095 +0000 UTC m=+168.607472497" lastFinishedPulling="2026-04-21 14:28:10.440654904 +0000 UTC m=+169.989881321" observedRunningTime="2026-04-21 14:28:11.408763677 +0000 UTC m=+170.957990097" watchObservedRunningTime="2026-04-21 14:28:11.409496338 +0000 UTC m=+170.958722758" Apr 21 14:28:12.448085 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.448046 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"] Apr 21 14:28:12.452387 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.452363 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw" Apr 21 14:28:12.455459 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.455421 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 14:28:12.455459 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.455421 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mf5vk\"" Apr 21 14:28:12.462751 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.462705 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"] Apr 21 14:28:12.572375 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.572344 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c67k6"] Apr 21 14:28:12.576604 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:28:12.576582 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.579009 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.578986 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 14:28:12.579145 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.578985 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qlm9d\""
Apr 21 14:28:12.579145 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.579001 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 14:28:12.579222 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.579151 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 14:28:12.579776 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.579759 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 14:28:12.589990 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.589953 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c67k6"]
Apr 21 14:28:12.599535 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.599506 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6a7d3a3-8dbf-493f-99dd-7922d8495302-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xfvsw\" (UID: \"b6a7d3a3-8dbf-493f-99dd-7922d8495302\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"
Apr 21 14:28:12.701033 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.700926 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6a7d3a3-8dbf-493f-99dd-7922d8495302-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xfvsw\" (UID: \"b6a7d3a3-8dbf-493f-99dd-7922d8495302\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"
Apr 21 14:28:12.701033 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.700982 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37d8c7f0-2689-4181-9759-32285450db5e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.701033 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.701020 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxt6q\" (UniqueName: \"kubernetes.io/projected/37d8c7f0-2689-4181-9759-32285450db5e-kube-api-access-sxt6q\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.701283 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.701081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37d8c7f0-2689-4181-9759-32285450db5e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.701283 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.701229 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37d8c7f0-2689-4181-9759-32285450db5e-data-volume\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.701283 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.701260 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37d8c7f0-2689-4181-9759-32285450db5e-crio-socket\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.703451 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.703431 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6a7d3a3-8dbf-493f-99dd-7922d8495302-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xfvsw\" (UID: \"b6a7d3a3-8dbf-493f-99dd-7922d8495302\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"
Apr 21 14:28:12.761817 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.761775 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"
Apr 21 14:28:12.801874 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.801839 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37d8c7f0-2689-4181-9759-32285450db5e-data-volume\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.801874 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.801882 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37d8c7f0-2689-4181-9759-32285450db5e-crio-socket\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.802121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.801950 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37d8c7f0-2689-4181-9759-32285450db5e-crio-socket\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.802121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.801977 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37d8c7f0-2689-4181-9759-32285450db5e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.802121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.802007 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxt6q\" (UniqueName: \"kubernetes.io/projected/37d8c7f0-2689-4181-9759-32285450db5e-kube-api-access-sxt6q\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.802121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.802050 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37d8c7f0-2689-4181-9759-32285450db5e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.802318 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.802199 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37d8c7f0-2689-4181-9759-32285450db5e-data-volume\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.802539 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.802521 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37d8c7f0-2689-4181-9759-32285450db5e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.804356 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.804326 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37d8c7f0-2689-4181-9759-32285450db5e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.814342 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.813935 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxt6q\" (UniqueName: \"kubernetes.io/projected/37d8c7f0-2689-4181-9759-32285450db5e-kube-api-access-sxt6q\") pod \"insights-runtime-extractor-c67k6\" (UID: \"37d8c7f0-2689-4181-9759-32285450db5e\") " pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.883826 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.883790 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"]
Apr 21 14:28:12.885556 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:12.885537 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c67k6"
Apr 21 14:28:12.887322 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:12.887295 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6a7d3a3_8dbf_493f_99dd_7922d8495302.slice/crio-65439591be75a8a13ec9f76ee9d2677dc72d13eeebbd8e203caffa5fe26ddc26 WatchSource:0}: Error finding container 65439591be75a8a13ec9f76ee9d2677dc72d13eeebbd8e203caffa5fe26ddc26: Status 404 returned error can't find the container with id 65439591be75a8a13ec9f76ee9d2677dc72d13eeebbd8e203caffa5fe26ddc26
Apr 21 14:28:13.011471 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:13.011431 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c67k6"]
Apr 21 14:28:13.014334 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:13.014305 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d8c7f0_2689_4181_9759_32285450db5e.slice/crio-70a8601f9e500fb2dcca55faf8c5ac757270f47f556c8a743ad4deade07408d5 WatchSource:0}: Error finding container 70a8601f9e500fb2dcca55faf8c5ac757270f47f556c8a743ad4deade07408d5: Status 404 returned error can't find the container with id 70a8601f9e500fb2dcca55faf8c5ac757270f47f556c8a743ad4deade07408d5
Apr 21 14:28:13.398952 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:13.398913 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c67k6" event={"ID":"37d8c7f0-2689-4181-9759-32285450db5e","Type":"ContainerStarted","Data":"759018a3eed33f3ab40b2d15fdeddb60aaa165b5e8d1b9cba39b5423c15b7aec"}
Apr 21 14:28:13.398952 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:13.398954 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c67k6" event={"ID":"37d8c7f0-2689-4181-9759-32285450db5e","Type":"ContainerStarted","Data":"70a8601f9e500fb2dcca55faf8c5ac757270f47f556c8a743ad4deade07408d5"}
Apr 21 14:28:13.399900 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:13.399867 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw" event={"ID":"b6a7d3a3-8dbf-493f-99dd-7922d8495302","Type":"ContainerStarted","Data":"65439591be75a8a13ec9f76ee9d2677dc72d13eeebbd8e203caffa5fe26ddc26"}
Apr 21 14:28:14.405021 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.404980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c67k6" event={"ID":"37d8c7f0-2689-4181-9759-32285450db5e","Type":"ContainerStarted","Data":"ccbb0047b46c27fb2bd8526a2c3cf21d55c46c4b6b5fc0f9cfad0b7418e60d90"}
Apr 21 14:28:14.406587 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.406554 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw" event={"ID":"b6a7d3a3-8dbf-493f-99dd-7922d8495302","Type":"ContainerStarted","Data":"e12079dd859c666a10776ddbdae8b1ec6d28ec006f61d94dd38fbf13fd85f6c7"}
Apr 21 14:28:14.407554 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.406823 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"
Apr 21 14:28:14.412641 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.412617 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw"
Apr 21 14:28:14.426612 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.426563 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xfvsw" podStartSLOduration=0.993605909 podStartE2EDuration="2.426550829s" podCreationTimestamp="2026-04-21 14:28:12 +0000 UTC" firstStartedPulling="2026-04-21 14:28:12.889589558 +0000 UTC m=+172.438815969" lastFinishedPulling="2026-04-21 14:28:14.322534486 +0000 UTC m=+173.871760889" observedRunningTime="2026-04-21 14:28:14.426068105 +0000 UTC m=+173.975294527" watchObservedRunningTime="2026-04-21 14:28:14.426550829 +0000 UTC m=+173.975777250"
Apr 21 14:28:14.823956 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.823920 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"]
Apr 21 14:28:14.827383 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.827359 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:14.830908 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.830760 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 14:28:14.830908 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.830808 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 14:28:14.831165 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.831149 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 14:28:14.831235 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.831149 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 14:28:14.831294 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.831275 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-prqsf\""
Apr 21 14:28:14.831845 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.831827 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 14:28:14.843530 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.843470 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"]
Apr 21 14:28:14.919882 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.919832 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbv2v\" (UniqueName: \"kubernetes.io/projected/10e06e54-27b1-40e3-868d-0417cbbfed4c-kube-api-access-sbv2v\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:14.920063 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.920003 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:14.920124 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.920065 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10e06e54-27b1-40e3-868d-0417cbbfed4c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:14.920124 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:14.920099 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.020978 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.020930 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbv2v\" (UniqueName: \"kubernetes.io/projected/10e06e54-27b1-40e3-868d-0417cbbfed4c-kube-api-access-sbv2v\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.021151 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.021052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.021151 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.021104 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10e06e54-27b1-40e3-868d-0417cbbfed4c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.021151 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.021137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.021330 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:15.021189 2583 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 21 14:28:15.021330 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:15.021250 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-tls podName:10e06e54-27b1-40e3-868d-0417cbbfed4c nodeName:}" failed. No retries permitted until 2026-04-21 14:28:15.5212322 +0000 UTC m=+175.070458601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gnbgt" (UID: "10e06e54-27b1-40e3-868d-0417cbbfed4c") : secret "prometheus-operator-tls" not found
Apr 21 14:28:15.021948 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.021924 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10e06e54-27b1-40e3-868d-0417cbbfed4c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.023855 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.023833 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.031272 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.031239 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbv2v\" (UniqueName: \"kubernetes.io/projected/10e06e54-27b1-40e3-868d-0417cbbfed4c-kube-api-access-sbv2v\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.411269 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.411237 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c67k6" event={"ID":"37d8c7f0-2689-4181-9759-32285450db5e","Type":"ContainerStarted","Data":"ad0197e8bf928fe2326f932fcb2e5a900879eb01b17c0229cd828591a4803dd2"}
Apr 21 14:28:15.427984 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.427939 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c67k6" podStartSLOduration=1.210707291 podStartE2EDuration="3.427924753s" podCreationTimestamp="2026-04-21 14:28:12 +0000 UTC" firstStartedPulling="2026-04-21 14:28:13.074855688 +0000 UTC m=+172.624082087" lastFinishedPulling="2026-04-21 14:28:15.292073148 +0000 UTC m=+174.841299549" observedRunningTime="2026-04-21 14:28:15.427076018 +0000 UTC m=+174.976302439" watchObservedRunningTime="2026-04-21 14:28:15.427924753 +0000 UTC m=+174.977151173"
Apr 21 14:28:15.524745 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.524690 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.527070 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.527048 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/10e06e54-27b1-40e3-868d-0417cbbfed4c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gnbgt\" (UID: \"10e06e54-27b1-40e3-868d-0417cbbfed4c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.737580 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.737540 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"
Apr 21 14:28:15.855033 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:15.855001 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gnbgt"]
Apr 21 14:28:15.858278 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:15.858248 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e06e54_27b1_40e3_868d_0417cbbfed4c.slice/crio-6df0bf1a181bda764babc1efc6b0276edb35ae1ed155cc5446b1e3ff814dd4ec WatchSource:0}: Error finding container 6df0bf1a181bda764babc1efc6b0276edb35ae1ed155cc5446b1e3ff814dd4ec: Status 404 returned error can't find the container with id 6df0bf1a181bda764babc1efc6b0276edb35ae1ed155cc5446b1e3ff814dd4ec
Apr 21 14:28:16.414896 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:16.414853 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt" event={"ID":"10e06e54-27b1-40e3-868d-0417cbbfed4c","Type":"ContainerStarted","Data":"6df0bf1a181bda764babc1efc6b0276edb35ae1ed155cc5446b1e3ff814dd4ec"}
Apr 21 14:28:17.418687 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:17.418597 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt" event={"ID":"10e06e54-27b1-40e3-868d-0417cbbfed4c","Type":"ContainerStarted","Data":"31a0450a601e0d7b8b11e859be8eb13a7728da9de2ac3e2caa99d87e4b1574cd"}
Apr 21 14:28:17.418687 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:17.418637 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt" event={"ID":"10e06e54-27b1-40e3-868d-0417cbbfed4c","Type":"ContainerStarted","Data":"0fe1a6d6cda2cd3c064ef93719ee2646e7e2750a13fb54f00f3492438d6e0a2b"}
Apr 21 14:28:17.438508 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:17.438455 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gnbgt" podStartSLOduration=2.162733635 podStartE2EDuration="3.438440737s" podCreationTimestamp="2026-04-21 14:28:14 +0000 UTC" firstStartedPulling="2026-04-21 14:28:15.860132686 +0000 UTC m=+175.409359085" lastFinishedPulling="2026-04-21 14:28:17.135839785 +0000 UTC m=+176.685066187" observedRunningTime="2026-04-21 14:28:17.437048597 +0000 UTC m=+176.986275018" watchObservedRunningTime="2026-04-21 14:28:17.438440737 +0000 UTC m=+176.987667158"
Apr 21 14:28:19.236980 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.236938 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9bgw4"]
Apr 21 14:28:19.240351 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.240328 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.242617 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.242589 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 14:28:19.242789 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.242590 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 14:28:19.242789 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.242712 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 14:28:19.242789 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.242772 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k54lk\""
Apr 21 14:28:19.357540 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357499 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357567 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-accelerators-collector-config\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357609 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-textfile\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357644 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-tls\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqk52\" (UniqueName: \"kubernetes.io/projected/0804186e-8a55-4aa4-9a48-ec393ed70e24-kube-api-access-gqk52\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357719 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-root\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357912 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357833 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-sys\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357912 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357862 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-wtmp\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.357912 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.357877 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0804186e-8a55-4aa4-9a48-ec393ed70e24-metrics-client-ca\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458423 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458375 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458623 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458449 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-accelerators-collector-config\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458623 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458490 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-textfile\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458623 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458530 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-tls\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458623 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458554 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqk52\" (UniqueName: \"kubernetes.io/projected/0804186e-8a55-4aa4-9a48-ec393ed70e24-kube-api-access-gqk52\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458623 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458582 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-root\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458623 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458610 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-sys\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458944 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458650 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-wtmp\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458944 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458672 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-root\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458944 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458679 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0804186e-8a55-4aa4-9a48-ec393ed70e24-metrics-client-ca\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458944 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458718 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-sys\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458944 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.458825 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-wtmp\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4"
Apr 21 14:28:19.458944 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:19.458890 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 14:28:19.459281 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:28:19.458971 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-tls podName:0804186e-8a55-4aa4-9a48-ec393ed70e24 nodeName:}" failed. No retries permitted until 2026-04-21 14:28:19.958949339 +0000 UTC m=+179.508175738 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-tls") pod "node-exporter-9bgw4" (UID: "0804186e-8a55-4aa4-9a48-ec393ed70e24") : secret "node-exporter-tls" not found Apr 21 14:28:19.459281 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.459059 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-textfile\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:19.459416 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.459396 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0804186e-8a55-4aa4-9a48-ec393ed70e24-metrics-client-ca\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:19.459460 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.459438 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-accelerators-collector-config\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:19.460836 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.460818 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:19.467628 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:28:19.467601 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqk52\" (UniqueName: \"kubernetes.io/projected/0804186e-8a55-4aa4-9a48-ec393ed70e24-kube-api-access-gqk52\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:19.963034 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.962997 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-tls\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:19.965405 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:19.965376 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0804186e-8a55-4aa4-9a48-ec393ed70e24-node-exporter-tls\") pod \"node-exporter-9bgw4\" (UID: \"0804186e-8a55-4aa4-9a48-ec393ed70e24\") " pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:20.149924 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.149885 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9bgw4" Apr 21 14:28:20.159376 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:20.159343 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0804186e_8a55_4aa4_9a48_ec393ed70e24.slice/crio-b623ea487e92d5cbcad8339d559e82dfed69e3381265939bb8248d0954d5fb31 WatchSource:0}: Error finding container b623ea487e92d5cbcad8339d559e82dfed69e3381265939bb8248d0954d5fb31: Status 404 returned error can't find the container with id b623ea487e92d5cbcad8339d559e82dfed69e3381265939bb8248d0954d5fb31 Apr 21 14:28:20.272104 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.272024 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:28:20.277101 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.277079 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.279607 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279580 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 14:28:20.279607 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279601 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 14:28:20.279881 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279601 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 14:28:20.279881 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279601 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 14:28:20.279881 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279689 2583 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 14:28:20.279881 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279580 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-s7wg6\"" Apr 21 14:28:20.280048 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.279919 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 14:28:20.280121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.280107 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 14:28:20.280121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.280115 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 14:28:20.280213 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.280131 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 14:28:20.291638 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.291616 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:28:20.427631 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.427600 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9bgw4" event={"ID":"0804186e-8a55-4aa4-9a48-ec393ed70e24","Type":"ContainerStarted","Data":"b623ea487e92d5cbcad8339d559e82dfed69e3381265939bb8248d0954d5fb31"} Apr 21 14:28:20.467787 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467754 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.467787 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467790 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-out\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.467994 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467811 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.467994 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467856 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4twg\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-kube-api-access-h4twg\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.467994 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467936 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.467994 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467966 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.467994 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.467988 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.468229 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.468011 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.468229 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.468098 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-web-config\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.468229 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.468152 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.468355 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.468240 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.468355 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.468275 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.468355 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.468340 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.568928 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.568900 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-web-config\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569046 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.568944 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569094 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569067 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569094 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569088 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569194 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569123 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569194 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569157 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569450 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569315 
2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-out\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569450 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569358 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569450 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569389 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4twg\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-kube-api-access-h4twg\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569450 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569419 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569711 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569454 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569711 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569483 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569711 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569507 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569855 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.569930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.569867 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.570513 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.570489 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-trusted-ca-bundle\") 
pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.572127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.571814 2583 patch_prober.go:28] interesting pod/image-registry-6f4494879d-mn7p6 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 14:28:20.572127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.571869 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" podUID="149199e9-acc2-4844-9b85-9231431c2811" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:28:20.572451 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.572431 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.572553 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.572458 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.572785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.572762 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.572785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.572774 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-out\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.572932 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.572851 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.572932 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.572875 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-web-config\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.573257 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.573234 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.573687 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.573663 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.574357 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.574337 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.581762 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.581739 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4twg\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-kube-api-access-h4twg\") pod \"alertmanager-main-0\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.588127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.588109 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:28:20.721005 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:20.720963 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:28:20.724655 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:20.724623 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e8a9c06_7424_4932_8589_906b80d6f3fe.slice/crio-20b43327f59b6e980bb81aa5114275fabd91fe687c4de51dbd2083a3f2a724c8 WatchSource:0}: Error finding container 20b43327f59b6e980bb81aa5114275fabd91fe687c4de51dbd2083a3f2a724c8: Status 404 returned error can't find the container with id 20b43327f59b6e980bb81aa5114275fabd91fe687c4de51dbd2083a3f2a724c8 Apr 21 14:28:21.161109 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.161070 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" podUID="b6da1a60-6b4c-4412-b979-77b3fb44f43d" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 21 14:28:21.398500 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.398410 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-flqfn" Apr 21 14:28:21.432085 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.432020 2583 generic.go:358] "Generic (PLEG): container finished" podID="b6da1a60-6b4c-4412-b979-77b3fb44f43d" containerID="78f6f53d2a0d2101f42e1b3fd2bfc5272e7e1a52ee4cf5813c717a78a90aadf5" exitCode=1 Apr 21 14:28:21.432085 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.432073 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" 
event={"ID":"b6da1a60-6b4c-4412-b979-77b3fb44f43d","Type":"ContainerDied","Data":"78f6f53d2a0d2101f42e1b3fd2bfc5272e7e1a52ee4cf5813c717a78a90aadf5"} Apr 21 14:28:21.432531 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.432510 2583 scope.go:117] "RemoveContainer" containerID="78f6f53d2a0d2101f42e1b3fd2bfc5272e7e1a52ee4cf5813c717a78a90aadf5" Apr 21 14:28:21.433785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.433756 2583 generic.go:358] "Generic (PLEG): container finished" podID="0804186e-8a55-4aa4-9a48-ec393ed70e24" containerID="9e7bb8e0d92c0e4e18479235c3351afed6cccf6b4105f672ab774a3038b98746" exitCode=0 Apr 21 14:28:21.433893 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.433863 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9bgw4" event={"ID":"0804186e-8a55-4aa4-9a48-ec393ed70e24","Type":"ContainerDied","Data":"9e7bb8e0d92c0e4e18479235c3351afed6cccf6b4105f672ab774a3038b98746"} Apr 21 14:28:21.435501 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:21.435419 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"20b43327f59b6e980bb81aa5114275fabd91fe687c4de51dbd2083a3f2a724c8"} Apr 21 14:28:22.175238 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.175201 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-db764b9c9-q4t6s"] Apr 21 14:28:22.178854 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.178836 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.182071 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.182047 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ppbwj\"" Apr 21 14:28:22.182615 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.182594 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 14:28:22.182615 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.182606 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 14:28:22.182615 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.182595 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 14:28:22.185947 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.185920 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-metrics-client-ca\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186055 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186005 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186055 ip-10-0-138-93 kubenswrapper[2583]: I0421 
14:28:22.186039 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186148 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186085 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-www96\" (UniqueName: \"kubernetes.io/projected/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-kube-api-access-www96\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186148 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186122 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186237 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186159 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-tls\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186237 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186204 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186329 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186264 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-grpc-tls\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.186382 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186326 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-eigrpo958vopt\"" Apr 21 14:28:22.186451 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186433 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 14:28:22.186587 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.186572 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 14:28:22.205699 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.205672 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-db764b9c9-q4t6s"] Apr 21 14:28:22.286952 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.286920 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-www96\" (UniqueName: \"kubernetes.io/projected/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-kube-api-access-www96\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") 
" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287111 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.286965 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287111 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287004 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-tls\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287111 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287028 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287111 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287047 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-grpc-tls\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287111 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287085 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-metrics-client-ca\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287413 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287127 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.287413 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287159 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.288060 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.287986 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-metrics-client-ca\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.289751 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.289702 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-tls\") pod 
\"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.289751 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.289706 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.289971 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.289955 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.290140 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.290119 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.290176 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.290155 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " 
pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.290230 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.290214 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-secret-grpc-tls\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.295187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.295170 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-www96\" (UniqueName: \"kubernetes.io/projected/a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5-kube-api-access-www96\") pod \"thanos-querier-db764b9c9-q4t6s\" (UID: \"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5\") " pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.373897 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.373872 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:28:22.439406 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.439307 2583 generic.go:358] "Generic (PLEG): container finished" podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="759cd70eb3a75e2394ff24a83a8b7168601c51a96fa437a2a252bbe543c7ebec" exitCode=0 Apr 21 14:28:22.439838 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.439404 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"759cd70eb3a75e2394ff24a83a8b7168601c51a96fa437a2a252bbe543c7ebec"} Apr 21 14:28:22.441098 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.441075 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" 
event={"ID":"b6da1a60-6b4c-4412-b979-77b3fb44f43d","Type":"ContainerStarted","Data":"6eb81c3774031500bc0cf41adcaeb5b9d7d0e60b1c53129eae0c0863d84a7787"} Apr 21 14:28:22.441340 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.441312 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:28:22.442000 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.441961 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78bd496cff-9n9wh" Apr 21 14:28:22.444595 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.443124 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9bgw4" event={"ID":"0804186e-8a55-4aa4-9a48-ec393ed70e24","Type":"ContainerStarted","Data":"b5ef2f172ce956ae0737883ba6f61f584c37df180447d9615dd406834e5303f4"} Apr 21 14:28:22.444595 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.443147 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9bgw4" event={"ID":"0804186e-8a55-4aa4-9a48-ec393ed70e24","Type":"ContainerStarted","Data":"7aab9538fcd686ea41fc6683b8235a1b80d428dd782e2f9858c2e8353d9eaa68"} Apr 21 14:28:22.488011 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.487970 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:22.500060 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.500011 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9bgw4" podStartSLOduration=2.57696099 podStartE2EDuration="3.499997469s" podCreationTimestamp="2026-04-21 14:28:19 +0000 UTC" firstStartedPulling="2026-04-21 14:28:20.161103997 +0000 UTC m=+179.710330402" lastFinishedPulling="2026-04-21 14:28:21.084140482 +0000 UTC m=+180.633366881" observedRunningTime="2026-04-21 14:28:22.498346812 +0000 UTC m=+182.047573270" watchObservedRunningTime="2026-04-21 14:28:22.499997469 +0000 UTC m=+182.049223890" Apr 21 14:28:22.610890 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:22.610863 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-db764b9c9-q4t6s"] Apr 21 14:28:22.613385 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:22.613359 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6cb3be0_b8f9_40f2_8ce9_8d112541cfb5.slice/crio-753fac164dd38bb70c1482533cd3d3b675a11d322c764695482b02486375afff WatchSource:0}: Error finding container 753fac164dd38bb70c1482533cd3d3b675a11d322c764695482b02486375afff: Status 404 returned error can't find the container with id 753fac164dd38bb70c1482533cd3d3b675a11d322c764695482b02486375afff Apr 21 14:28:23.447269 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.447227 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"753fac164dd38bb70c1482533cd3d3b675a11d322c764695482b02486375afff"} Apr 21 14:28:23.500275 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.500238 2583 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/metrics-server-67ffd846c7-zsv56"] Apr 21 14:28:23.503811 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.503788 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.506023 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.505997 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 14:28:23.506982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.506954 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-41becqr8aq7u\"" Apr 21 14:28:23.506982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.506965 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 14:28:23.506982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.506978 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 14:28:23.507196 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.506964 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 14:28:23.507196 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.507078 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-cq8bl\"" Apr 21 14:28:23.512802 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.512686 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67ffd846c7-zsv56"] Apr 21 14:28:23.599541 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.599504 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-client-ca-bundle\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.599541 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.599541 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhplb\" (UniqueName: \"kubernetes.io/projected/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-kube-api-access-mhplb\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.599781 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.599564 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.599781 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.599640 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-secret-metrics-server-tls\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.599868 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.599787 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-secret-metrics-server-client-certs\") pod 
\"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.599972 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.599946 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-metrics-server-audit-profiles\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.600026 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.600010 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-audit-log\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701042 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.700945 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-secret-metrics-server-client-certs\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701203 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.701073 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-metrics-server-audit-profiles\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701203 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.701123 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-audit-log\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701203 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.701146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-client-ca-bundle\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701203 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.701169 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhplb\" (UniqueName: \"kubernetes.io/projected/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-kube-api-access-mhplb\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701203 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.701199 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.701463 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.701226 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-secret-metrics-server-tls\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.702290 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.702219 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-audit-log\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.702747 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.702685 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.702873 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.702809 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-metrics-server-audit-profiles\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.704036 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.703992 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-secret-metrics-server-client-certs\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " 
pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.704364 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.704337 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-secret-metrics-server-tls\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.704471 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.704443 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-client-ca-bundle\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.709811 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.709784 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhplb\" (UniqueName: \"kubernetes.io/projected/cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5-kube-api-access-mhplb\") pod \"metrics-server-67ffd846c7-zsv56\" (UID: \"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5\") " pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:23.819031 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:23.818992 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:24.228419 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:24.228398 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67ffd846c7-zsv56"] Apr 21 14:28:24.231220 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:24.231136 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8c72e3_8e7f_482a_817d_7b5e2fc81ab5.slice/crio-6daf3583c8e9a3a274790a9b63ad9b5938deb9d70518140138e5b24ea47612de WatchSource:0}: Error finding container 6daf3583c8e9a3a274790a9b63ad9b5938deb9d70518140138e5b24ea47612de: Status 404 returned error can't find the container with id 6daf3583c8e9a3a274790a9b63ad9b5938deb9d70518140138e5b24ea47612de Apr 21 14:28:24.453380 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:24.453341 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"6ec18e76d6af89060d2affa22b3e66e47e7bb847471cf05425b63955750dd310"} Apr 21 14:28:24.453778 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:24.453395 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"65162d5c23db95de29eb9a66d7f55622a41057406c97e61765aca0716b3146da"} Apr 21 14:28:24.453778 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:24.453411 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"ba476908950bfda252dfb8856ad4bb4aef65af8ee5d30f10b035f4e880affa1e"} Apr 21 14:28:24.453778 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:24.453423 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"a36850763588f250efbfd64f2bd9febed19270d6106121ea5ddcca66f7ad0601"} Apr 21 14:28:24.454570 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:24.454544 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" event={"ID":"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5","Type":"ContainerStarted","Data":"6daf3583c8e9a3a274790a9b63ad9b5938deb9d70518140138e5b24ea47612de"} Apr 21 14:28:25.453054 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.451793 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:28:25.458896 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.458208 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.462763 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.462051 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 14:28:25.462763 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.462213 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 14:28:25.462763 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.462386 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wjsmt\"" Apr 21 14:28:25.462763 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.462554 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 14:28:25.462763 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.462645 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 
21 14:28:25.463095 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.462854 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 14:28:25.463095 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.463029 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8mjhe40to8dsm\"" Apr 21 14:28:25.463201 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.463143 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 14:28:25.463486 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.463307 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 14:28:25.464519 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.464152 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 14:28:25.464831 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.464765 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 14:28:25.465165 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.465144 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 14:28:25.466271 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.466253 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 14:28:25.471418 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.470841 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" 
event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"fd9402a5baa82666c93dea025eca2a035789b82a0ede3a9ae4c9c1c653c8146a"} Apr 21 14:28:25.471418 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.470897 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"773e2bf5915707d8da0cd162213be505d792d3bc6d39fc2febea527ff8836bfc"} Apr 21 14:28:25.471418 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.470911 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"b15e865b098d970a288cfa0d94e092548153bf41e7f246d2c7536ababe3ecbd2"} Apr 21 14:28:25.473419 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.473381 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:28:25.474344 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.474017 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 14:28:25.477577 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.477540 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"240e75c1aebcf48a0abb64629585b7e542c307af398d18b5184971795f8967fd"} Apr 21 14:28:25.518959 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.518918 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.518980 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519010 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-config-out\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519032 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519057 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-web-config\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519109 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519138 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519165 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519208 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519232 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519320 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-config\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519357 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519391 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519389 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519719 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519413 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqpv5\" (UniqueName: 
\"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-kube-api-access-mqpv5\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519719 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519444 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519719 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519473 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.519719 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.519496 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.620890 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.620845 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-config\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.620901 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.620935 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.620958 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqpv5\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-kube-api-access-mqpv5\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.620992 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621016 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621081 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621040 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621112 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621150 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621174 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-config-out\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621197 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621221 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-web-config\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621247 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621275 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621310 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621379 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621335 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621827 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621383 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621827 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621411 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.621940 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.621918 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.622517 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.622182 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.622655 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.622528 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.626016 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.627049 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.627552 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.627681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.627756 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-config-out\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.628359 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.629034 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.630588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.630445 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.631174 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.630817 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.631174 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.631112 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-web-config\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.631429 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.631368 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.631429 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.631371 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.631773 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.631712 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-config\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.632260 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.632238 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.635131 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.635089 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mqpv5\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-kube-api-access-mqpv5\") pod \"prometheus-k8s-0\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.776009 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.775552 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:25.948296 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:25.948260 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:28:26.301198 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:28:26.301165 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7218f973_94dc_400c_8d39_65e605d6ae84.slice/crio-c1131f31ba52c778eb97905b8dac10f8eeed27dcf974a53e85e1e3919945dd96 WatchSource:0}: Error finding container c1131f31ba52c778eb97905b8dac10f8eeed27dcf974a53e85e1e3919945dd96: Status 404 returned error can't find the container with id c1131f31ba52c778eb97905b8dac10f8eeed27dcf974a53e85e1e3919945dd96 Apr 21 14:28:26.484016 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.483978 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerStarted","Data":"f92f50be38e5aae334916aa447b1e80ad501099fcdff813c71bc168743ea18b3"} Apr 21 14:28:26.485658 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.485626 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" event={"ID":"cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5","Type":"ContainerStarted","Data":"adf8edde05565e30b0383da2f13268ace26af6ac8a5d95fe1d132a77ab98710e"} Apr 21 14:28:26.487110 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.487084 2583 
generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc" exitCode=0 Apr 21 14:28:26.487230 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.487171 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc"} Apr 21 14:28:26.487230 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.487205 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"c1131f31ba52c778eb97905b8dac10f8eeed27dcf974a53e85e1e3919945dd96"} Apr 21 14:28:26.489595 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.489574 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"939eeb89924a066562f220f8619cd2b4b9b4d8fd9526a8f918c75b030f0a7212"} Apr 21 14:28:26.512129 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.511960 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.487735565 podStartE2EDuration="6.511939004s" podCreationTimestamp="2026-04-21 14:28:20 +0000 UTC" firstStartedPulling="2026-04-21 14:28:20.727835554 +0000 UTC m=+180.277061968" lastFinishedPulling="2026-04-21 14:28:25.752038993 +0000 UTC m=+185.301265407" observedRunningTime="2026-04-21 14:28:26.509475502 +0000 UTC m=+186.058701923" watchObservedRunningTime="2026-04-21 14:28:26.511939004 +0000 UTC m=+186.061165427" Apr 21 14:28:26.558750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:26.558662 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" podStartSLOduration=1.430311431 podStartE2EDuration="3.558643422s" podCreationTimestamp="2026-04-21 14:28:23 +0000 UTC" firstStartedPulling="2026-04-21 14:28:24.233445956 +0000 UTC m=+183.782672368" lastFinishedPulling="2026-04-21 14:28:26.36177796 +0000 UTC m=+185.911004359" observedRunningTime="2026-04-21 14:28:26.556883753 +0000 UTC m=+186.106110175" watchObservedRunningTime="2026-04-21 14:28:26.558643422 +0000 UTC m=+186.107869844" Apr 21 14:28:27.495684 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:27.495646 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"205021067ba626da04d61dbbf36e59025bc138ad43595c3322d7e6a3035db9b0"} Apr 21 14:28:27.495684 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:27.495688 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" event={"ID":"a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5","Type":"ContainerStarted","Data":"8e6539044345415c9a97ccbd974d83b0110ec4f2bb6b047de6214fe51a008c47"} Apr 21 14:28:27.521219 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:27.521156 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" podStartSLOduration=1.774492597 podStartE2EDuration="5.521138738s" podCreationTimestamp="2026-04-21 14:28:22 +0000 UTC" firstStartedPulling="2026-04-21 14:28:22.615708521 +0000 UTC m=+182.164934926" lastFinishedPulling="2026-04-21 14:28:26.362354668 +0000 UTC m=+185.911581067" observedRunningTime="2026-04-21 14:28:27.51917178 +0000 UTC m=+187.068398200" watchObservedRunningTime="2026-04-21 14:28:27.521138738 +0000 UTC m=+187.070365150" Apr 21 14:28:28.499510 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:28.499477 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:29.505839 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:29.505800 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8"} Apr 21 14:28:29.506222 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:29.505850 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52"} Apr 21 14:28:29.506222 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:29.505866 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2"} Apr 21 14:28:30.512329 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:30.512295 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d"} Apr 21 14:28:30.512329 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:30.512332 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5"} Apr 21 14:28:30.512798 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:30.512343 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerStarted","Data":"baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483"} Apr 21 14:28:30.539861 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:30.539801 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.780913793 podStartE2EDuration="5.539786775s" podCreationTimestamp="2026-04-21 14:28:25 +0000 UTC" firstStartedPulling="2026-04-21 14:28:26.488436808 +0000 UTC m=+186.037663207" lastFinishedPulling="2026-04-21 14:28:29.247309787 +0000 UTC m=+188.796536189" observedRunningTime="2026-04-21 14:28:30.538291692 +0000 UTC m=+190.087518123" watchObservedRunningTime="2026-04-21 14:28:30.539786775 +0000 UTC m=+190.089013195" Apr 21 14:28:30.776252 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:30.776160 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:28:34.511450 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:34.511420 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-db764b9c9-q4t6s" Apr 21 14:28:34.810545 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:34.810452 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f4494879d-mn7p6"] Apr 21 14:28:43.819648 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:43.819608 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:43.819648 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:43.819650 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:28:59.605761 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:59.605707 2583 generic.go:358] "Generic (PLEG): container finished" 
podID="43072f22-397f-42e1-a2fd-0d96e71b4412" containerID="6d9481bf81528bd5f44e09b09d190a2297892b6910345f67a6629c6bac45ff12" exitCode=0 Apr 21 14:28:59.606171 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:59.605785 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" event={"ID":"43072f22-397f-42e1-a2fd-0d96e71b4412","Type":"ContainerDied","Data":"6d9481bf81528bd5f44e09b09d190a2297892b6910345f67a6629c6bac45ff12"} Apr 21 14:28:59.606171 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:59.606141 2583 scope.go:117] "RemoveContainer" containerID="6d9481bf81528bd5f44e09b09d190a2297892b6910345f67a6629c6bac45ff12" Apr 21 14:28:59.830224 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:28:59.830181 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" podUID="149199e9-acc2-4844-9b85-9231431c2811" containerName="registry" containerID="cri-o://9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30" gracePeriod=30 Apr 21 14:29:00.075058 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.075033 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:29:00.147221 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147180 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-bound-sa-token\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147221 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147219 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147462 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147238 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzz5b\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-kube-api-access-zzz5b\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147462 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147274 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-registry-certificates\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147462 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147316 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-image-registry-private-configuration\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: 
\"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147462 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147335 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/149199e9-acc2-4844-9b85-9231431c2811-ca-trust-extracted\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147462 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147369 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-installation-pull-secrets\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.147713 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.147548 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-trusted-ca\") pod \"149199e9-acc2-4844-9b85-9231431c2811\" (UID: \"149199e9-acc2-4844-9b85-9231431c2811\") " Apr 21 14:29:00.148394 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.148340 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:00.148525 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.148421 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:00.150086 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.150060 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:00.150198 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.150110 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:00.150258 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.150223 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-kube-api-access-zzz5b" (OuterVolumeSpecName: "kube-api-access-zzz5b") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "kube-api-access-zzz5b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:00.150433 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.150403 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:00.150552 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.150438 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:00.157208 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.157179 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149199e9-acc2-4844-9b85-9231431c2811-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "149199e9-acc2-4844-9b85-9231431c2811" (UID: "149199e9-acc2-4844-9b85-9231431c2811"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248598 2583 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-image-registry-private-configuration\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248629 2583 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/149199e9-acc2-4844-9b85-9231431c2811-ca-trust-extracted\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248639 2583 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/149199e9-acc2-4844-9b85-9231431c2811-installation-pull-secrets\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248651 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-trusted-ca\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248660 2583 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-bound-sa-token\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248669 2583 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-registry-tls\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248678 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzz5b\" (UniqueName: \"kubernetes.io/projected/149199e9-acc2-4844-9b85-9231431c2811-kube-api-access-zzz5b\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.248689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.248686 2583 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/149199e9-acc2-4844-9b85-9231431c2811-registry-certificates\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:00.609935 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.609898 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-24zfn" event={"ID":"43072f22-397f-42e1-a2fd-0d96e71b4412","Type":"ContainerStarted","Data":"f629879768ad425fc70af43a60323321120d335186bc14b4d792cb07a7977ff8"} Apr 21 14:29:00.611012 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.610989 2583 generic.go:358] "Generic (PLEG): container finished" podID="149199e9-acc2-4844-9b85-9231431c2811" containerID="9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30" exitCode=0 Apr 21 14:29:00.611127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.611045 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" Apr 21 14:29:00.611127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.611061 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" event={"ID":"149199e9-acc2-4844-9b85-9231431c2811","Type":"ContainerDied","Data":"9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30"} Apr 21 14:29:00.611127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.611083 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f4494879d-mn7p6" event={"ID":"149199e9-acc2-4844-9b85-9231431c2811","Type":"ContainerDied","Data":"92ab120f0fc0869b2e5f896f07ab59a68387131b2464f5d1862b87b20201becb"} Apr 21 14:29:00.611127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.611109 2583 scope.go:117] "RemoveContainer" containerID="9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30" Apr 21 14:29:00.619572 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.619551 2583 scope.go:117] "RemoveContainer" containerID="9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30" Apr 21 14:29:00.619847 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:00.619826 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30\": container with ID starting with 9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30 not found: ID does not exist" containerID="9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30" Apr 21 14:29:00.619929 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.619862 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30"} err="failed to get container status 
\"9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30\": rpc error: code = NotFound desc = could not find container \"9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30\": container with ID starting with 9f925f2051f63528bdeaa35dbc0c1e2045d06f5da6e5d8c33b5c66ba14f80a30 not found: ID does not exist" Apr 21 14:29:00.639905 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.639861 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f4494879d-mn7p6"] Apr 21 14:29:00.645053 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.645023 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6f4494879d-mn7p6"] Apr 21 14:29:00.927527 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:00.927451 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149199e9-acc2-4844-9b85-9231431c2811" path="/var/lib/kubelet/pods/149199e9-acc2-4844-9b85-9231431c2811/volumes" Apr 21 14:29:03.825549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:03.825517 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:29:03.829396 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:03.829370 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-67ffd846c7-zsv56" Apr 21 14:29:25.776578 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:25.776523 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:25.795869 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:25.795843 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:26.702588 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:26.702560 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:31.932001 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:31.931957 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:29:31.934351 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:31.934331 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425eadc2-ce6c-4aeb-9856-41d3b15c076b-metrics-certs\") pod \"network-metrics-daemon-bcph6\" (UID: \"425eadc2-ce6c-4aeb-9856-41d3b15c076b\") " pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:29:32.027456 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:32.027426 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9wlmf\"" Apr 21 14:29:32.035792 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:32.035756 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcph6" Apr 21 14:29:32.161166 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:32.161130 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcph6"] Apr 21 14:29:32.163867 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:29:32.163832 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod425eadc2_ce6c_4aeb_9856_41d3b15c076b.slice/crio-2e00ccba863aeee27165bf575501869bb4ae5b1eed9110a951360b7ea36e8ca2 WatchSource:0}: Error finding container 2e00ccba863aeee27165bf575501869bb4ae5b1eed9110a951360b7ea36e8ca2: Status 404 returned error can't find the container with id 2e00ccba863aeee27165bf575501869bb4ae5b1eed9110a951360b7ea36e8ca2 Apr 21 14:29:32.705417 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:32.705376 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcph6" event={"ID":"425eadc2-ce6c-4aeb-9856-41d3b15c076b","Type":"ContainerStarted","Data":"2e00ccba863aeee27165bf575501869bb4ae5b1eed9110a951360b7ea36e8ca2"} Apr 21 14:29:33.710940 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:33.710903 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcph6" event={"ID":"425eadc2-ce6c-4aeb-9856-41d3b15c076b","Type":"ContainerStarted","Data":"b6b51fcc4d9d6721f5437e97045fe3e77c7768727465664956f2c327c6052c1b"} Apr 21 14:29:33.710940 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:33.710943 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcph6" event={"ID":"425eadc2-ce6c-4aeb-9856-41d3b15c076b","Type":"ContainerStarted","Data":"ebfc03555c64ed0cfc46291a202f6cfc55be4183ddb980397dcfb2233f8fadb2"} Apr 21 14:29:33.727522 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:33.727467 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-bcph6" podStartSLOduration=251.743041131 podStartE2EDuration="4m12.727451205s" podCreationTimestamp="2026-04-21 14:25:21 +0000 UTC" firstStartedPulling="2026-04-21 14:29:32.165743396 +0000 UTC m=+251.714969798" lastFinishedPulling="2026-04-21 14:29:33.150153469 +0000 UTC m=+252.699379872" observedRunningTime="2026-04-21 14:29:33.725612562 +0000 UTC m=+253.274838993" watchObservedRunningTime="2026-04-21 14:29:33.727451205 +0000 UTC m=+253.276677625" Apr 21 14:29:43.612184 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612087 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:29:43.612630 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612583 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="alertmanager" containerID="cri-o://a36850763588f250efbfd64f2bd9febed19270d6106121ea5ddcca66f7ad0601" gracePeriod=120 Apr 21 14:29:43.612771 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612657 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-metric" containerID="cri-o://240e75c1aebcf48a0abb64629585b7e542c307af398d18b5184971795f8967fd" gracePeriod=120 Apr 21 14:29:43.612771 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612670 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy" containerID="cri-o://6ec18e76d6af89060d2affa22b3e66e47e7bb847471cf05425b63955750dd310" gracePeriod=120 Apr 21 14:29:43.612771 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612711 2583 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="config-reloader" containerID="cri-o://ba476908950bfda252dfb8856ad4bb4aef65af8ee5d30f10b035f4e880affa1e" gracePeriod=120 Apr 21 14:29:43.612771 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612655 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-web" containerID="cri-o://65162d5c23db95de29eb9a66d7f55622a41057406c97e61765aca0716b3146da" gracePeriod=120 Apr 21 14:29:43.612960 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.612694 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="prom-label-proxy" containerID="cri-o://f92f50be38e5aae334916aa447b1e80ad501099fcdff813c71bc168743ea18b3" gracePeriod=120 Apr 21 14:29:43.747236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747204 2583 generic.go:358] "Generic (PLEG): container finished" podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="f92f50be38e5aae334916aa447b1e80ad501099fcdff813c71bc168743ea18b3" exitCode=0 Apr 21 14:29:43.747236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747231 2583 generic.go:358] "Generic (PLEG): container finished" podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="6ec18e76d6af89060d2affa22b3e66e47e7bb847471cf05425b63955750dd310" exitCode=0 Apr 21 14:29:43.747236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747239 2583 generic.go:358] "Generic (PLEG): container finished" podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="ba476908950bfda252dfb8856ad4bb4aef65af8ee5d30f10b035f4e880affa1e" exitCode=0 Apr 21 14:29:43.747236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747245 2583 generic.go:358] "Generic (PLEG): container finished" 
podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="a36850763588f250efbfd64f2bd9febed19270d6106121ea5ddcca66f7ad0601" exitCode=0 Apr 21 14:29:43.747514 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747279 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"f92f50be38e5aae334916aa447b1e80ad501099fcdff813c71bc168743ea18b3"} Apr 21 14:29:43.747514 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747321 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"6ec18e76d6af89060d2affa22b3e66e47e7bb847471cf05425b63955750dd310"} Apr 21 14:29:43.747514 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747335 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"ba476908950bfda252dfb8856ad4bb4aef65af8ee5d30f10b035f4e880affa1e"} Apr 21 14:29:43.747514 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:43.747347 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"a36850763588f250efbfd64f2bd9febed19270d6106121ea5ddcca66f7ad0601"} Apr 21 14:29:44.753963 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.753930 2583 generic.go:358] "Generic (PLEG): container finished" podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="240e75c1aebcf48a0abb64629585b7e542c307af398d18b5184971795f8967fd" exitCode=0 Apr 21 14:29:44.753963 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.753956 2583 generic.go:358] "Generic (PLEG): container finished" podID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerID="65162d5c23db95de29eb9a66d7f55622a41057406c97e61765aca0716b3146da" 
exitCode=0 Apr 21 14:29:44.754354 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.754003 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"240e75c1aebcf48a0abb64629585b7e542c307af398d18b5184971795f8967fd"} Apr 21 14:29:44.754354 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.754043 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"65162d5c23db95de29eb9a66d7f55622a41057406c97e61765aca0716b3146da"} Apr 21 14:29:44.867880 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.867855 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:44.944875 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.944789 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-main-tls\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.944875 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.944841 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-web-config\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.944875 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.944873 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy\") pod 
\"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.944902 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-main-db\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.944932 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.944965 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-trusted-ca-bundle\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945001 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-cluster-tls-config\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945033 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-metrics-client-ca\") pod 
\"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945058 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-tls-assets\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945102 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4twg\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-kube-api-access-h4twg\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945128 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-out\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945497 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945162 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-volume\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " Apr 21 14:29:44.945497 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945198 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8e8a9c06-7424-4932-8589-906b80d6f3fe\" (UID: \"8e8a9c06-7424-4932-8589-906b80d6f3fe\") " 
Apr 21 14:29:44.945497 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945262 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:29:44.945646 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.945523 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-main-db\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:44.946184 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.946151 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:44.946523 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.946495 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:44.947712 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.947637 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:44.947856 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.947752 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:44.947928 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.947896 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-kube-api-access-h4twg" (OuterVolumeSpecName: "kube-api-access-h4twg") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "kube-api-access-h4twg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:44.948523 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.948489 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:44.948928 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.948894 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:44.949127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.948972 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:44.949127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.949067 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-out" (OuterVolumeSpecName: "config-out") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:29:44.949350 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.949332 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:44.953428 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.953389 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:44.958922 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:44.958896 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-web-config" (OuterVolumeSpecName: "web-config") pod "8e8a9c06-7424-4932-8589-906b80d6f3fe" (UID: "8e8a9c06-7424-4932-8589-906b80d6f3fe"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:45.046453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046404 2583 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-cluster-tls-config\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046447 2583 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-metrics-client-ca\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046459 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-tls-assets\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046467 2583 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4twg\" (UniqueName: \"kubernetes.io/projected/8e8a9c06-7424-4932-8589-906b80d6f3fe-kube-api-access-h4twg\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046476 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-out\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046488 2583 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-config-volume\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046497 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046507 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-main-tls\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046516 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-web-config\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046524 2583 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046534 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8a9c06-7424-4932-8589-906b80d6f3fe-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.046700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.046543 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8a9c06-7424-4932-8589-906b80d6f3fe-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:45.760509 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.760472 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e8a9c06-7424-4932-8589-906b80d6f3fe","Type":"ContainerDied","Data":"20b43327f59b6e980bb81aa5114275fabd91fe687c4de51dbd2083a3f2a724c8"} Apr 21 14:29:45.760509 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.760518 2583 scope.go:117] "RemoveContainer" containerID="f92f50be38e5aae334916aa447b1e80ad501099fcdff813c71bc168743ea18b3" Apr 21 14:29:45.761036 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.760551 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.768838 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.768817 2583 scope.go:117] "RemoveContainer" containerID="240e75c1aebcf48a0abb64629585b7e542c307af398d18b5184971795f8967fd" Apr 21 14:29:45.776235 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.776213 2583 scope.go:117] "RemoveContainer" containerID="6ec18e76d6af89060d2affa22b3e66e47e7bb847471cf05425b63955750dd310" Apr 21 14:29:45.783470 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.783446 2583 scope.go:117] "RemoveContainer" containerID="65162d5c23db95de29eb9a66d7f55622a41057406c97e61765aca0716b3146da" Apr 21 14:29:45.785943 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.785920 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:29:45.789466 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.789441 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:29:45.791792 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.791772 2583 scope.go:117] "RemoveContainer" containerID="ba476908950bfda252dfb8856ad4bb4aef65af8ee5d30f10b035f4e880affa1e" Apr 21 14:29:45.798331 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.798314 2583 scope.go:117] "RemoveContainer" containerID="a36850763588f250efbfd64f2bd9febed19270d6106121ea5ddcca66f7ad0601" Apr 21 14:29:45.804898 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.804876 2583 scope.go:117] "RemoveContainer" containerID="759cd70eb3a75e2394ff24a83a8b7168601c51a96fa437a2a252bbe543c7ebec" Apr 21 14:29:45.813686 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.813662 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:29:45.814075 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814058 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="init-config-reloader" Apr 21 14:29:45.814075 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814077 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="init-config-reloader" Apr 21 14:29:45.814075 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814084 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-metric" Apr 21 14:29:45.814075 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814090 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-metric" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814100 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-web" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814106 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-web" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814114 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814119 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814128 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="config-reloader" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814133 2583 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="config-reloader" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814142 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="149199e9-acc2-4844-9b85-9231431c2811" containerName="registry" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814147 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="149199e9-acc2-4844-9b85-9231431c2811" containerName="registry" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814159 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="prom-label-proxy" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814164 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="prom-label-proxy" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814171 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="alertmanager" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814178 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="alertmanager" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814221 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-metric" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814231 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="149199e9-acc2-4844-9b85-9231431c2811" containerName="registry" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814237 2583 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="prom-label-proxy" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814243 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="config-reloader" Apr 21 14:29:45.814243 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814250 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy" Apr 21 14:29:45.814700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814256 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="alertmanager" Apr 21 14:29:45.814700 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.814263 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" containerName="kube-rbac-proxy-web" Apr 21 14:29:45.818972 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.818950 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.821673 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.821642 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 14:29:45.821801 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.821757 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-s7wg6\"" Apr 21 14:29:45.821801 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.821782 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 14:29:45.821914 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.821812 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 14:29:45.821968 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.821926 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 14:29:45.821968 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.821927 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 14:29:45.822052 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.822004 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 14:29:45.822052 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.822023 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 14:29:45.822151 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.822074 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 14:29:45.827958 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.827715 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 14:29:45.829306 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.829282 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:29:45.853499 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853468 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ab2c0316-53f9-4129-9cc1-a7970a962cb9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853514 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853538 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2c0316-53f9-4129-9cc1-a7970a962cb9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853577 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853619 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853683 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab2c0316-53f9-4129-9cc1-a7970a962cb9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853710 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqf8\" (UniqueName: \"kubernetes.io/projected/ab2c0316-53f9-4129-9cc1-a7970a962cb9-kube-api-access-nzqf8\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.853758 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853755 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.854150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853809 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.854150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853841 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-web-config\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.854150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853880 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab2c0316-53f9-4129-9cc1-a7970a962cb9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.854150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.853989 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.854150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.854040 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/ab2c0316-53f9-4129-9cc1-a7970a962cb9-config-out\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955100 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955011 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955100 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955100 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955076 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-web-config\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955377 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955111 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab2c0316-53f9-4129-9cc1-a7970a962cb9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955377 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955139 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955377 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955260 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab2c0316-53f9-4129-9cc1-a7970a962cb9-config-out\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.955377 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955347 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ab2c0316-53f9-4129-9cc1-a7970a962cb9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.958184 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.955380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.958184 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958082 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2c0316-53f9-4129-9cc1-a7970a962cb9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:29:45.958361 ip-10-0-138-93 kubenswrapper[2583]: 
I0421 14:29:45.958194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.958361 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958249 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.958361 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958321 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab2c0316-53f9-4129-9cc1-a7970a962cb9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.958547 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958354 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ab2c0316-53f9-4129-9cc1-a7970a962cb9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.958547 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958362 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqf8\" (UniqueName: \"kubernetes.io/projected/ab2c0316-53f9-4129-9cc1-a7970a962cb9-kube-api-access-nzqf8\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.958547 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958528 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-web-config\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.959773 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958683 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.959773 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.958819 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab2c0316-53f9-4129-9cc1-a7970a962cb9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.959773 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.959116 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.959773 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.959491 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-config-volume\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.964852 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.964800 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab2c0316-53f9-4129-9cc1-a7970a962cb9-config-out\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.964977 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.964954 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2c0316-53f9-4129-9cc1-a7970a962cb9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.965044 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.965028 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.965088 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.965035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.965370 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.965349 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ab2c0316-53f9-4129-9cc1-a7970a962cb9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.965783 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.965763 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab2c0316-53f9-4129-9cc1-a7970a962cb9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:45.966950 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:45.966933 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqf8\" (UniqueName: \"kubernetes.io/projected/ab2c0316-53f9-4129-9cc1-a7970a962cb9-kube-api-access-nzqf8\") pod \"alertmanager-main-0\" (UID: \"ab2c0316-53f9-4129-9cc1-a7970a962cb9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:46.130652 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:46.130613 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 14:29:46.263318 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:46.263148 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 14:29:46.266069 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:29:46.266036 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2c0316_53f9_4129_9cc1_a7970a962cb9.slice/crio-0d1ad9dff061d8446b06c9510a25bb9ddb0c2e59dd0ed315cfd87377d46ce1bc WatchSource:0}: Error finding container 0d1ad9dff061d8446b06c9510a25bb9ddb0c2e59dd0ed315cfd87377d46ce1bc: Status 404 returned error can't find the container with id 0d1ad9dff061d8446b06c9510a25bb9ddb0c2e59dd0ed315cfd87377d46ce1bc
Apr 21 14:29:46.765081 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:46.765042 2583 generic.go:358] "Generic (PLEG): container finished" podID="ab2c0316-53f9-4129-9cc1-a7970a962cb9" containerID="0ef22bd2952f0c339226f0252b2d22d5cca58baf01f3a7dd007714615dba6ee7" exitCode=0
Apr 21 14:29:46.765511 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:46.765143 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerDied","Data":"0ef22bd2952f0c339226f0252b2d22d5cca58baf01f3a7dd007714615dba6ee7"}
Apr 21 14:29:46.765511 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:46.765177 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"0d1ad9dff061d8446b06c9510a25bb9ddb0c2e59dd0ed315cfd87377d46ce1bc"}
Apr 21 14:29:46.928841 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:46.928809 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8a9c06-7424-4932-8589-906b80d6f3fe" path="/var/lib/kubelet/pods/8e8a9c06-7424-4932-8589-906b80d6f3fe/volumes"
Apr 21 14:29:47.654618 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.654580 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-86c4c66c-rfkzl"]
Apr 21 14:29:47.658183 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.658156 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.660821 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.660801 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-fjglr\""
Apr 21 14:29:47.660964 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.660834 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 14:29:47.661065 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.661049 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 14:29:47.661124 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.661103 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 14:29:47.661562 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.661545 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 14:29:47.661653 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.661571 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 14:29:47.667068 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.667036 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 14:29:47.669924 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.669899 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-86c4c66c-rfkzl"]
Apr 21 14:29:47.774187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.773242 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"eaf90d247bb60d6d5818b731d3f700c964d46ba282dad2a087968ffb8c413440"}
Apr 21 14:29:47.774187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.773284 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"5b21fc8b2e32751a7a541d1fe2a8de403f3928c3f2bb6d6c22e9b94c85f7a121"}
Apr 21 14:29:47.774187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.773298 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"0440fb7b36577673a59023c3956ad37bdce70bc40dfbdb7c35352a4a5b97b744"}
Apr 21 14:29:47.774187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.773310 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"f59a4b1cc7b67cb3f4ea0536f1d387262a2489f5e0172f1a52824297378fd3fa"}
Apr 21 14:29:47.774187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.773323 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"87a3d46dcde3c6290b4d7687fbf012dcbc8ae4ca0079ea11eae794ea9289b66c"}
Apr 21 14:29:47.774187 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.773335 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ab2c0316-53f9-4129-9cc1-a7970a962cb9","Type":"ContainerStarted","Data":"5e79187bffede29a726b26d68e5e80422c5034593a436f20ab77671a1885652a"}
Apr 21 14:29:47.775785 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.775761 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-metrics-client-ca\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.775867 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.775851 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.775923 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.775908 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-secret-telemeter-client\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.775968 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.775943 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-telemeter-client-tls\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.776047 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.776022 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-serving-certs-ca-bundle\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.776088 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.776072 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtnfg\" (UniqueName: \"kubernetes.io/projected/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-kube-api-access-qtnfg\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.776123 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.776113 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-federate-client-tls\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.776159 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.776130 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.802132 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.802069 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.802048584 podStartE2EDuration="2.802048584s" podCreationTimestamp="2026-04-21 14:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:29:47.800688835 +0000 UTC m=+267.349915259" watchObservedRunningTime="2026-04-21 14:29:47.802048584 +0000 UTC m=+267.351275005"
Apr 21 14:29:47.876737 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.876681 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-metrics-client-ca\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.876930 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.876863 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877057 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877022 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-secret-telemeter-client\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877158 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877063 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-telemeter-client-tls\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877213 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877154 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-serving-certs-ca-bundle\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877213 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877188 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtnfg\" (UniqueName: \"kubernetes.io/projected/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-kube-api-access-qtnfg\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877311 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877245 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-federate-client-tls\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877311 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877282 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.877811 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.877770 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-metrics-client-ca\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.878211 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.878187 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-serving-certs-ca-bundle\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.878372 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.878272 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.880513 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.880408 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-telemeter-client-tls\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.880903 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.880882 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-federate-client-tls\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.881685 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.881660 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-secret-telemeter-client\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.881942 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.881920 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.888127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.888102 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtnfg\" (UniqueName: \"kubernetes.io/projected/ca1cc24d-4246-470d-b19e-e1e3c38e3d5f-kube-api-access-qtnfg\") pod \"telemeter-client-86c4c66c-rfkzl\" (UID: \"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f\") " pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.970179 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.970077 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl"
Apr 21 14:29:47.980369 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.980334 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 14:29:47.981236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.980943 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="prometheus" containerID="cri-o://7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2" gracePeriod=600
Apr 21 14:29:47.981236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.980958 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy" containerID="cri-o://0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5" gracePeriod=600
Apr 21 14:29:47.981236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.980965 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="thanos-sidecar" containerID="cri-o://dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8" gracePeriod=600
Apr 21 14:29:47.981236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.981058 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-web" containerID="cri-o://baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483" gracePeriod=600
Apr 21 14:29:47.981236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.981113 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-thanos" containerID="cri-o://c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d" gracePeriod=600
Apr 21 14:29:47.981236 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:47.981158 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="config-reloader" containerID="cri-o://37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52" gracePeriod=600
Apr 21 14:29:48.139921 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.139829 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-86c4c66c-rfkzl"]
Apr 21 14:29:48.143034 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:29:48.143003 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1cc24d_4246_470d_b19e_e1e3c38e3d5f.slice/crio-44b827bbdd38cf2fe6b6b9e082ecb06f92f002e198beff367a2ca76d39c0b13f WatchSource:0}: Error finding container 44b827bbdd38cf2fe6b6b9e082ecb06f92f002e198beff367a2ca76d39c0b13f: Status 404 returned error can't find the container with id 44b827bbdd38cf2fe6b6b9e082ecb06f92f002e198beff367a2ca76d39c0b13f
Apr 21 14:29:48.777424 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.777376 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl" event={"ID":"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f","Type":"ContainerStarted","Data":"44b827bbdd38cf2fe6b6b9e082ecb06f92f002e198beff367a2ca76d39c0b13f"}
Apr 21 14:29:48.780178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780152 2583 generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d" exitCode=0
Apr 21 14:29:48.780178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780174 2583 generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5" exitCode=0
Apr 21 14:29:48.780178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780181 2583 generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8" exitCode=0
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780188 2583 generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52" exitCode=0
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780196 2583 generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2" exitCode=0
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780224 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d"}
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780263 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5"}
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780276 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8"}
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780288 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52"}
Apr 21 14:29:48.780393 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:48.780302 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2"}
Apr 21 14:29:49.237416 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.237392 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 14:29:49.291993 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.291901 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-rulefiles-0\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.291993 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.291954 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-tls-assets\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.292206 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.292007 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-kube-rbac-proxy\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.292408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.292385 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-kubelet-serving-ca-bundle\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.292636 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.292620 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-tls\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.292837 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.292822 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-metrics-client-certs\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293006 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.292840 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 14:29:49.293091 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.292977 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-metrics-client-ca\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293091 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293069 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293201 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293115 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-db\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293201 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293159 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-grpc-tls\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293201 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293191 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293343 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293227 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-config\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293343 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293256 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-trusted-ca-bundle\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293343 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293288 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-config-out\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293343 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293316 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-web-config\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293537 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293342 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-serving-certs-ca-bundle\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293537 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293368 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-thanos-prometheus-http-client-file\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293537 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293395 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqpv5\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-kube-api-access-mqpv5\") pod \"7218f973-94dc-400c-8d39-65e605d6ae84\" (UID: \"7218f973-94dc-400c-8d39-65e605d6ae84\") "
Apr 21 14:29:49.293689 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293628 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:49.294020 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293759 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.294020 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.293793 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.294982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.294476 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:49.295315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.295234 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:29:49.295958 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.295891 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:49.296940 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.296913 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:29:49.297555 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.297526 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.297820 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.297717 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:49.297925 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.297903 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-config-out" (OuterVolumeSpecName: "config-out") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:29:49.298476 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.298437 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.298614 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.298591 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-config" (OuterVolumeSpecName: "config") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.299140 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.299084 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.299140 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.299061 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.299622 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.299579 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-kube-api-access-mqpv5" (OuterVolumeSpecName: "kube-api-access-mqpv5") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "kube-api-access-mqpv5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:29:49.303398 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.300012 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.303398 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.300480 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.304593 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.304561 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.315366 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.315333 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-web-config" (OuterVolumeSpecName: "web-config") pod "7218f973-94dc-400c-8d39-65e605d6ae84" (UID: "7218f973-94dc-400c-8d39-65e605d6ae84"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:29:49.395231 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395184 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395231 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395229 2583 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395231 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395243 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqpv5\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-kube-api-access-mqpv5\") on node 
\"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395253 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7218f973-94dc-400c-8d39-65e605d6ae84-tls-assets\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395264 2583 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-kube-rbac-proxy\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395273 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395283 2583 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-metrics-client-certs\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395297 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-configmap-metrics-client-ca\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395313 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-thanos-sidecar-tls\") on node 
\"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395328 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-k8s-db\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395342 2583 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-grpc-tls\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395354 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395363 2583 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-config\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395372 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7218f973-94dc-400c-8d39-65e605d6ae84-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395380 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7218f973-94dc-400c-8d39-65e605d6ae84-config-out\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.395473 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.395388 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7218f973-94dc-400c-8d39-65e605d6ae84-web-config\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:29:49.786274 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.786238 2583 generic.go:358] "Generic (PLEG): container finished" podID="7218f973-94dc-400c-8d39-65e605d6ae84" containerID="baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483" exitCode=0 Apr 21 14:29:49.786671 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.786315 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483"} Apr 21 14:29:49.786671 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.786347 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.786671 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.786368 2583 scope.go:117] "RemoveContainer" containerID="c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d" Apr 21 14:29:49.786671 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.786354 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7218f973-94dc-400c-8d39-65e605d6ae84","Type":"ContainerDied","Data":"c1131f31ba52c778eb97905b8dac10f8eeed27dcf974a53e85e1e3919945dd96"} Apr 21 14:29:49.794564 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.794542 2583 scope.go:117] "RemoveContainer" containerID="0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5" Apr 21 14:29:49.801160 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.801143 2583 scope.go:117] "RemoveContainer" containerID="baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483" Apr 21 14:29:49.807642 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.807620 2583 scope.go:117] "RemoveContainer" containerID="dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8" Apr 21 14:29:49.810387 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.810356 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:29:49.814819 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.814798 2583 scope.go:117] "RemoveContainer" containerID="37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52" Apr 21 14:29:49.817328 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.817306 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:29:49.823120 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.823102 2583 scope.go:117] "RemoveContainer" containerID="7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2" Apr 21 14:29:49.831587 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:29:49.831560 2583 scope.go:117] "RemoveContainer" containerID="e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc" Apr 21 14:29:49.838258 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.838228 2583 scope.go:117] "RemoveContainer" containerID="c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d" Apr 21 14:29:49.838498 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.838478 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d\": container with ID starting with c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d not found: ID does not exist" containerID="c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d" Apr 21 14:29:49.838620 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.838511 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d"} err="failed to get container status \"c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d\": rpc error: code = NotFound desc = could not find container \"c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d\": container with ID starting with c6a9eab84ad37d2f86213f696fdf75f0b11e12d65bff36486e061ed401ced78d not found: ID does not exist" Apr 21 14:29:49.838620 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.838539 2583 scope.go:117] "RemoveContainer" containerID="0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5" Apr 21 14:29:49.838802 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.838785 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5\": container with ID starting with 
0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5 not found: ID does not exist" containerID="0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5" Apr 21 14:29:49.838847 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.838809 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5"} err="failed to get container status \"0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5\": rpc error: code = NotFound desc = could not find container \"0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5\": container with ID starting with 0eff4983322f3fb6ca529fcf1f32d0b033fdcf1aaf8f9a438eebb9784ccf20c5 not found: ID does not exist" Apr 21 14:29:49.838847 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.838829 2583 scope.go:117] "RemoveContainer" containerID="baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483" Apr 21 14:29:49.839077 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.839056 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483\": container with ID starting with baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483 not found: ID does not exist" containerID="baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483" Apr 21 14:29:49.839118 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839082 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483"} err="failed to get container status \"baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483\": rpc error: code = NotFound desc = could not find container \"baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483\": container with ID starting with 
baff6e96cf63f4409ad11bf606c42ec1d2e9b992f5cdae25b1082456d883c483 not found: ID does not exist" Apr 21 14:29:49.839118 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839097 2583 scope.go:117] "RemoveContainer" containerID="dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8" Apr 21 14:29:49.839309 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.839292 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8\": container with ID starting with dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8 not found: ID does not exist" containerID="dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8" Apr 21 14:29:49.839370 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839317 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8"} err="failed to get container status \"dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8\": rpc error: code = NotFound desc = could not find container \"dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8\": container with ID starting with dacf51e303cb34647828b64039fe08557d29dbb90ce5a94eae6d1272be4236f8 not found: ID does not exist" Apr 21 14:29:49.839370 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839337 2583 scope.go:117] "RemoveContainer" containerID="37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52" Apr 21 14:29:49.839568 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.839550 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52\": container with ID starting with 37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52 not found: ID does not exist" 
containerID="37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52" Apr 21 14:29:49.839608 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839572 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52"} err="failed to get container status \"37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52\": rpc error: code = NotFound desc = could not find container \"37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52\": container with ID starting with 37b00b74c05e3103536c82c0cc6c51e98130f5ed48746d7c602ca52c223d3e52 not found: ID does not exist" Apr 21 14:29:49.839608 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839586 2583 scope.go:117] "RemoveContainer" containerID="7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2" Apr 21 14:29:49.839809 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.839793 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2\": container with ID starting with 7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2 not found: ID does not exist" containerID="7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2" Apr 21 14:29:49.839856 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839812 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2"} err="failed to get container status \"7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2\": rpc error: code = NotFound desc = could not find container \"7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2\": container with ID starting with 7aec14f58fb0a1d075b151996726ad8a1dfdce7be76109e6d3fd05eb3c0b97e2 not found: ID does not exist" Apr 21 
14:29:49.839856 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.839826 2583 scope.go:117] "RemoveContainer" containerID="e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc" Apr 21 14:29:49.840041 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:49.840024 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc\": container with ID starting with e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc not found: ID does not exist" containerID="e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc" Apr 21 14:29:49.840086 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.840047 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc"} err="failed to get container status \"e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc\": rpc error: code = NotFound desc = could not find container \"e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc\": container with ID starting with e83ff6d86eb9e3722d06689781280e93c2a2d8c7416a99fb0b3525b2083d3dcc not found: ID does not exist" Apr 21 14:29:49.844963 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.844940 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:29:49.845283 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845267 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845286 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845304 2583 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="config-reloader" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845313 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="config-reloader" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845330 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="thanos-sidecar" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845339 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="thanos-sidecar" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845355 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="prometheus" Apr 21 14:29:49.845362 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845363 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="prometheus" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845379 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-thanos" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845388 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-thanos" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845399 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="init-config-reloader" Apr 21 14:29:49.845679 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:29:49.845408 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="init-config-reloader" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845417 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-web" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845425 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-web" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845510 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845522 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="config-reloader" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845533 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-thanos" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845546 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="thanos-sidecar" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845559 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="prometheus" Apr 21 14:29:49.845679 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.845568 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" containerName="kube-rbac-proxy-web" Apr 21 14:29:49.851054 
ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.851034 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.853619 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.853581 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 14:29:49.853918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.853771 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 14:29:49.853918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.853775 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 14:29:49.853918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.853894 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 14:29:49.854163 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854135 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 14:29:49.854225 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854183 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 14:29:49.854275 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854237 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 14:29:49.854275 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854249 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 14:29:49.854397 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854379 2583 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8mjhe40to8dsm\"" Apr 21 14:29:49.854455 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854380 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wjsmt\"" Apr 21 14:29:49.854524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854509 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 14:29:49.854786 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.854766 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 14:29:49.856305 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.856284 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 14:29:49.858811 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.858759 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 14:29:49.864214 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.864181 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:29:49.902250 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902211 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902250 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902250 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902276 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902322 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902367 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-web-config\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902383 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 21 14:29:49.902453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902402 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902453 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902424 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902474 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-config\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902496 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902511 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6dgb\" (UniqueName: 
\"kubernetes.io/projected/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-kube-api-access-s6dgb\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902553 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902587 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902610 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902627 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:29:49.902653 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-config-out\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902692 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:49.902718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:49.902720 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004000 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.003902 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004000 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.003951 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" 
(UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004000 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.003986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004016 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004047 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-web-config\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004072 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004121 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004151 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004184 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-config\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004207 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004249 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004231 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6dgb\" (UniqueName: \"kubernetes.io/projected/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-kube-api-access-s6dgb\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004286 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004313 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004374 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004406 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-config-out\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004454 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.004584 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.004490 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.005166 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.005100 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.005166 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.005131 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.005554 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.005527 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
14:29:50.007645 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.007333 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-web-config\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.008416 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.008370 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.008540 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.008518 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.008737 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.008629 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.008925 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.008859 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.009104 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.009079 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.009377 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.009347 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.010121 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.010079 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.010741 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.010613 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.010824 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.010770 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.011247 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.011221 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-config\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.011696 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.011649 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-config-out\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.011803 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.011717 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.012222 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.012200 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.013846 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.013826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6dgb\" (UniqueName: \"kubernetes.io/projected/7ce890dd-192a-44d8-9ab9-7597bb0d75a5-kube-api-access-s6dgb\") pod \"prometheus-k8s-0\" (UID: 
\"7ce890dd-192a-44d8-9ab9-7597bb0d75a5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.161980 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.161937 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:50.310439 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.310415 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 14:29:50.312343 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:29:50.312318 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce890dd_192a_44d8_9ab9_7597bb0d75a5.slice/crio-4f771e1fce7e48fa68b8f96255357c80147d30135cf09fb357cf2123c4344282 WatchSource:0}: Error finding container 4f771e1fce7e48fa68b8f96255357c80147d30135cf09fb357cf2123c4344282: Status 404 returned error can't find the container with id 4f771e1fce7e48fa68b8f96255357c80147d30135cf09fb357cf2123c4344282 Apr 21 14:29:50.792078 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.792040 2583 generic.go:358] "Generic (PLEG): container finished" podID="7ce890dd-192a-44d8-9ab9-7597bb0d75a5" containerID="308e6abf7ddd02ce646f09c1e7b7001f8a8db46f468c8d7547cfa01e0de409be" exitCode=0 Apr 21 14:29:50.792529 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.792126 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerDied","Data":"308e6abf7ddd02ce646f09c1e7b7001f8a8db46f468c8d7547cfa01e0de409be"} Apr 21 14:29:50.792529 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:50.792181 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"4f771e1fce7e48fa68b8f96255357c80147d30135cf09fb357cf2123c4344282"} Apr 21 14:29:50.929706 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:29:50.929685 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7218f973-94dc-400c-8d39-65e605d6ae84" path="/var/lib/kubelet/pods/7218f973-94dc-400c-8d39-65e605d6ae84/volumes" Apr 21 14:29:51.798129 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.798081 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl" event={"ID":"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f","Type":"ContainerStarted","Data":"b134747d2d62f3a0be369a0c49dbc11688078e4651f0ebb37bf1573ec093b99c"} Apr 21 14:29:51.798129 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.798133 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl" event={"ID":"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f","Type":"ContainerStarted","Data":"3f490e5c081bed8f0b11eb56dc5230afdc6eef0366f64c50c7a48541ce496792"} Apr 21 14:29:51.798844 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.798149 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl" event={"ID":"ca1cc24d-4246-470d-b19e-e1e3c38e3d5f","Type":"ContainerStarted","Data":"da31a5d584dca6aec2720f1ae6bba7e90afa78591c73454649d25defd07f2895"} Apr 21 14:29:51.800907 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.800876 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"de514c74894356f451ea561b3cfd45cccdda3eb687cda7d5de8703fc79495344"} Apr 21 14:29:51.801025 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.800915 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"2bbf0f851fc713fb3d99d4277739bc0be1e2007102be89e8e50bf14479d1443e"} Apr 21 14:29:51.801025 ip-10-0-138-93 kubenswrapper[2583]: I0421 
14:29:51.800928 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"8dbe50cbfc8ee3ba5e9e533471241db272a8f5bc01752cc446ff96bc5521789c"} Apr 21 14:29:51.801025 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.800939 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"adc80d17e68b04374351c28ef2b1cfd5a6a30c0258f0aff6a2c054fc099c8e2b"} Apr 21 14:29:51.801025 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.800950 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"7203c726c95175d41f51d54c8f19238eac63b81dddca64d741551a724b105c17"} Apr 21 14:29:51.801025 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.800961 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ce890dd-192a-44d8-9ab9-7597bb0d75a5","Type":"ContainerStarted","Data":"4450ad9b4017e1c2240fd19bd5017b5b2d2b7bc8be1b0ebfb6c94e4a8c6375fc"} Apr 21 14:29:51.822083 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.822036 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-86c4c66c-rfkzl" podStartSLOduration=2.056944514 podStartE2EDuration="4.822021251s" podCreationTimestamp="2026-04-21 14:29:47 +0000 UTC" firstStartedPulling="2026-04-21 14:29:48.144915916 +0000 UTC m=+267.694142315" lastFinishedPulling="2026-04-21 14:29:50.909992645 +0000 UTC m=+270.459219052" observedRunningTime="2026-04-21 14:29:51.819842986 +0000 UTC m=+271.369069407" watchObservedRunningTime="2026-04-21 14:29:51.822021251 +0000 UTC m=+271.371247671" Apr 21 14:29:51.847709 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:51.847663 
2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.847648793 podStartE2EDuration="2.847648793s" podCreationTimestamp="2026-04-21 14:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:29:51.846342445 +0000 UTC m=+271.395568866" watchObservedRunningTime="2026-04-21 14:29:51.847648793 +0000 UTC m=+271.396875215" Apr 21 14:29:55.162816 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:55.162756 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:29:59.354326 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:29:59.354272 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" podUID="52f4ef6e-0001-42d5-acda-d8d1b7ce4e20" Apr 21 14:29:59.827066 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:29:59.827037 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:30:02.422745 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.422680 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:30:02.425130 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.425104 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/52f4ef6e-0001-42d5-acda-d8d1b7ce4e20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bcjt5\" (UID: \"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:30:02.524219 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.524180 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:30:02.526605 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.526582 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33175606-6499-415b-b273-193922870d52-cert\") pod \"ingress-canary-9pxm9\" (UID: \"33175606-6499-415b-b273-193922870d52\") " pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:30:02.530076 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.530049 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-console\"/\"default-dockercfg-j7vqv\"" Apr 21 14:30:02.538181 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.538151 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" Apr 21 14:30:02.627779 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.627714 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pqf56\"" Apr 21 14:30:02.635861 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.635819 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9pxm9" Apr 21 14:30:02.665429 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.665401 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5"] Apr 21 14:30:02.667597 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:30:02.667556 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f4ef6e_0001_42d5_acda_d8d1b7ce4e20.slice/crio-f0833b7cc3dd89fec67c789db05806e7a4b6eef3e585a247d31ec63f1f55735e WatchSource:0}: Error finding container f0833b7cc3dd89fec67c789db05806e7a4b6eef3e585a247d31ec63f1f55735e: Status 404 returned error can't find the container with id f0833b7cc3dd89fec67c789db05806e7a4b6eef3e585a247d31ec63f1f55735e Apr 21 14:30:02.758856 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.758831 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9pxm9"] Apr 21 14:30:02.761256 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:30:02.761228 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33175606_6499_415b_b273_193922870d52.slice/crio-610cfe75ed258d38ef8bd14fe9231326bbb9046fbe1b2ed267f7851b4f33acc8 WatchSource:0}: Error finding container 610cfe75ed258d38ef8bd14fe9231326bbb9046fbe1b2ed267f7851b4f33acc8: Status 404 returned error can't find the container with id 610cfe75ed258d38ef8bd14fe9231326bbb9046fbe1b2ed267f7851b4f33acc8 Apr 21 14:30:02.837076 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.837035 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" event={"ID":"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20","Type":"ContainerStarted","Data":"f0833b7cc3dd89fec67c789db05806e7a4b6eef3e585a247d31ec63f1f55735e"} Apr 21 14:30:02.838033 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:02.838010 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9pxm9" event={"ID":"33175606-6499-415b-b273-193922870d52","Type":"ContainerStarted","Data":"610cfe75ed258d38ef8bd14fe9231326bbb9046fbe1b2ed267f7851b4f33acc8"} Apr 21 14:30:03.842638 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:03.842595 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" event={"ID":"52f4ef6e-0001-42d5-acda-d8d1b7ce4e20","Type":"ContainerStarted","Data":"a1eefda2fb597aae9db00c1de99323ded32e90ed00ed031eef14ab4302859c04"} Apr 21 14:30:03.858658 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:03.858605 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bcjt5" podStartSLOduration=274.848270601 podStartE2EDuration="4m35.85858949s" podCreationTimestamp="2026-04-21 14:25:28 +0000 UTC" firstStartedPulling="2026-04-21 14:30:02.669505929 +0000 UTC m=+282.218732333" lastFinishedPulling="2026-04-21 14:30:03.679824811 +0000 UTC m=+283.229051222" 
observedRunningTime="2026-04-21 14:30:03.857616416 +0000 UTC m=+283.406842836" watchObservedRunningTime="2026-04-21 14:30:03.85858949 +0000 UTC m=+283.407815910" Apr 21 14:30:05.852293 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:05.852252 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9pxm9" event={"ID":"33175606-6499-415b-b273-193922870d52","Type":"ContainerStarted","Data":"bb709386444a4e6b9ac10d90731180c5b408b003c38b954333c3928897648113"} Apr 21 14:30:05.868394 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:05.868346 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9pxm9" podStartSLOduration=251.444046095 podStartE2EDuration="4m13.868329169s" podCreationTimestamp="2026-04-21 14:25:52 +0000 UTC" firstStartedPulling="2026-04-21 14:30:02.763207325 +0000 UTC m=+282.312433724" lastFinishedPulling="2026-04-21 14:30:05.187490397 +0000 UTC m=+284.736716798" observedRunningTime="2026-04-21 14:30:05.867549301 +0000 UTC m=+285.416775722" watchObservedRunningTime="2026-04-21 14:30:05.868329169 +0000 UTC m=+285.417555594" Apr 21 14:30:20.886293 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:20.886264 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:30:20.887476 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:20.887452 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:30:20.889977 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:20.889962 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 14:30:50.162490 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:50.162445 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
14:30:50.178341 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:50.178311 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:30:51.001072 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:30:51.001041 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 14:35:20.912587 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:35:20.912556 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:35:20.913467 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:35:20.913444 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:36:23.028067 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.028028 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fwwwz"] Apr 21 14:36:23.031368 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.031344 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.034466 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.034432 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 14:36:23.035150 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.035129 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 14:36:23.035282 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.035134 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-lr4ts\"" Apr 21 14:36:23.039859 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.039832 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fwwwz"] Apr 21 14:36:23.156470 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.156434 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fwwwz\" (UID: \"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.156677 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.156583 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7hz\" (UniqueName: \"kubernetes.io/projected/ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a-kube-api-access-wk7hz\") pod \"cert-manager-cainjector-68b757865b-fwwwz\" (UID: \"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.257955 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.257898 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fwwwz\" (UID: \"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.258178 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.258015 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7hz\" (UniqueName: \"kubernetes.io/projected/ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a-kube-api-access-wk7hz\") pod \"cert-manager-cainjector-68b757865b-fwwwz\" (UID: \"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.266633 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.266601 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fwwwz\" (UID: \"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.266823 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.266670 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7hz\" (UniqueName: \"kubernetes.io/projected/ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a-kube-api-access-wk7hz\") pod \"cert-manager-cainjector-68b757865b-fwwwz\" (UID: \"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.349336 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.349299 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" Apr 21 14:36:23.475568 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.475538 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fwwwz"] Apr 21 14:36:23.477899 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:36:23.477867 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad685e4f_8fdb_49d9_aeb5_9d6e1e7f649a.slice/crio-cbd091a925acfbe951a79f5c6225b1c9885fb0a0c53956869bb819ee75e3f5fd WatchSource:0}: Error finding container cbd091a925acfbe951a79f5c6225b1c9885fb0a0c53956869bb819ee75e3f5fd: Status 404 returned error can't find the container with id cbd091a925acfbe951a79f5c6225b1c9885fb0a0c53956869bb819ee75e3f5fd Apr 21 14:36:23.479777 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.479759 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:36:23.979979 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:23.979939 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" event={"ID":"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a","Type":"ContainerStarted","Data":"cbd091a925acfbe951a79f5c6225b1c9885fb0a0c53956869bb819ee75e3f5fd"} Apr 21 14:36:30.001479 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:30.001437 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" event={"ID":"ad685e4f-8fdb-49d9-aeb5-9d6e1e7f649a","Type":"ContainerStarted","Data":"56205eb7d6adc458bd1f8996948183496696ac2c735eef71b13ce659623137fe"} Apr 21 14:36:30.037841 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:30.037772 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-fwwwz" podStartSLOduration=0.649421213 podStartE2EDuration="7.037755222s" 
podCreationTimestamp="2026-04-21 14:36:23 +0000 UTC" firstStartedPulling="2026-04-21 14:36:23.479887524 +0000 UTC m=+663.029113926" lastFinishedPulling="2026-04-21 14:36:29.868221533 +0000 UTC m=+669.417447935" observedRunningTime="2026-04-21 14:36:30.03734921 +0000 UTC m=+669.586575632" watchObservedRunningTime="2026-04-21 14:36:30.037755222 +0000 UTC m=+669.586981637" Apr 21 14:36:44.809609 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.809564 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw"] Apr 21 14:36:44.812910 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.812893 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.816925 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.816899 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:36:44.817844 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.817816 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 14:36:44.817982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.817816 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-tzj2r\"" Apr 21 14:36:44.817982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.817822 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 14:36:44.817982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.817925 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 14:36:44.817982 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.817823 2583 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 14:36:44.827635 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.827609 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw"] Apr 21 14:36:44.846130 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.846095 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-cert\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.846290 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.846137 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qm52\" (UniqueName: \"kubernetes.io/projected/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-kube-api-access-6qm52\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.846290 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.846185 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-metrics-cert\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.846371 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.846319 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-manager-config\") pod 
\"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.947027 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.946982 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-manager-config\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.947027 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.947031 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-cert\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.947282 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.947063 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qm52\" (UniqueName: \"kubernetes.io/projected/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-kube-api-access-6qm52\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.947282 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.947086 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-metrics-cert\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.947761 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.947718 
2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-manager-config\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.949712 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.949686 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-metrics-cert\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.949853 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.949786 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-cert\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:44.956426 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:44.956398 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qm52\" (UniqueName: \"kubernetes.io/projected/71c52ede-73a2-45ff-829c-5fc6a8f4abaa-kube-api-access-6qm52\") pod \"lws-controller-manager-55dd4758fb-ng8rw\" (UID: \"71c52ede-73a2-45ff-829c-5fc6a8f4abaa\") " pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:45.123075 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:45.123040 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:45.253367 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:45.253330 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw"] Apr 21 14:36:45.255303 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:36:45.255265 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c52ede_73a2_45ff_829c_5fc6a8f4abaa.slice/crio-cb90e82025d360e9150a6abee75b19a30013d7d5e6c357378ec59b62ffe3cef9 WatchSource:0}: Error finding container cb90e82025d360e9150a6abee75b19a30013d7d5e6c357378ec59b62ffe3cef9: Status 404 returned error can't find the container with id cb90e82025d360e9150a6abee75b19a30013d7d5e6c357378ec59b62ffe3cef9 Apr 21 14:36:46.048867 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:46.048832 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" event={"ID":"71c52ede-73a2-45ff-829c-5fc6a8f4abaa","Type":"ContainerStarted","Data":"cb90e82025d360e9150a6abee75b19a30013d7d5e6c357378ec59b62ffe3cef9"} Apr 21 14:36:48.056860 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:48.056812 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" event={"ID":"71c52ede-73a2-45ff-829c-5fc6a8f4abaa","Type":"ContainerStarted","Data":"6fca1ad81574ee00ad3c42779ba07e68d872def76afb744942ad3b0bb42ec71e"} Apr 21 14:36:48.057265 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:48.056872 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:48.077485 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:48.077435 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" podStartSLOduration=1.917378915 podStartE2EDuration="4.077417936s" podCreationTimestamp="2026-04-21 14:36:44 +0000 UTC" firstStartedPulling="2026-04-21 14:36:45.257266384 +0000 UTC m=+684.806492783" lastFinishedPulling="2026-04-21 14:36:47.417305399 +0000 UTC m=+686.966531804" observedRunningTime="2026-04-21 14:36:48.075057893 +0000 UTC m=+687.624284312" watchObservedRunningTime="2026-04-21 14:36:48.077417936 +0000 UTC m=+687.626644356" Apr 21 14:36:50.768142 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.768110 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb"] Apr 21 14:36:50.771641 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.771620 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.775541 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.775512 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 14:36:50.775681 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.775513 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-dt2r5\"" Apr 21 14:36:50.775681 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.775626 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 14:36:50.775681 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.775640 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 14:36:50.775858 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.775708 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 14:36:50.793654 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.793625 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb"] Apr 21 14:36:50.800149 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.800118 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.800293 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.800155 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6x4\" (UniqueName: \"kubernetes.io/projected/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-kube-api-access-km6x4\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.800339 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.800301 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-webhook-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.900941 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.900887 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-webhook-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.901145 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.901038 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.901145 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.901062 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km6x4\" (UniqueName: \"kubernetes.io/projected/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-kube-api-access-km6x4\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.903440 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.903414 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.903556 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.903529 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-webhook-cert\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: 
\"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:50.918967 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:50.918934 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km6x4\" (UniqueName: \"kubernetes.io/projected/4c22033d-0f2b-48bc-b38a-a9e3da2e97ed-kube-api-access-km6x4\") pod \"opendatahub-operator-controller-manager-7df645bd74-gndhb\" (UID: \"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed\") " pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:51.083272 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:51.083234 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:51.226911 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:51.226877 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb"] Apr 21 14:36:51.230627 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:36:51.230603 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c22033d_0f2b_48bc_b38a_a9e3da2e97ed.slice/crio-f235398006c0ccbe01b047bd158f0f91cc7bce5daaedbea9b413968e72f9e776 WatchSource:0}: Error finding container f235398006c0ccbe01b047bd158f0f91cc7bce5daaedbea9b413968e72f9e776: Status 404 returned error can't find the container with id f235398006c0ccbe01b047bd158f0f91cc7bce5daaedbea9b413968e72f9e776 Apr 21 14:36:52.072347 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:52.072306 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" event={"ID":"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed","Type":"ContainerStarted","Data":"f235398006c0ccbe01b047bd158f0f91cc7bce5daaedbea9b413968e72f9e776"} Apr 21 14:36:54.082434 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:36:54.082398 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" event={"ID":"4c22033d-0f2b-48bc-b38a-a9e3da2e97ed","Type":"ContainerStarted","Data":"25e8a3cafc692dc05494ad9578b40b8d4f623c4c14cc3afd6c907a40b4686eb8"} Apr 21 14:36:54.082945 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:54.082506 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:36:59.062274 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:59.062243 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-55dd4758fb-ng8rw" Apr 21 14:36:59.080890 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:36:59.080835 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" podStartSLOduration=6.527545993 podStartE2EDuration="9.08081832s" podCreationTimestamp="2026-04-21 14:36:50 +0000 UTC" firstStartedPulling="2026-04-21 14:36:51.232483587 +0000 UTC m=+690.781709988" lastFinishedPulling="2026-04-21 14:36:53.785755914 +0000 UTC m=+693.334982315" observedRunningTime="2026-04-21 14:36:54.114772373 +0000 UTC m=+693.663998794" watchObservedRunningTime="2026-04-21 14:36:59.08081832 +0000 UTC m=+698.630044741" Apr 21 14:37:05.087281 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:05.087199 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7df645bd74-gndhb" Apr 21 14:37:09.526485 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.526445 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52"] Apr 21 14:37:09.529800 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.529776 2583 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.557113 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.557082 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 14:37:09.559750 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.559699 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 14:37:09.559916 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.559806 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-b2zpf\"" Apr 21 14:37:09.560069 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.560052 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 14:37:09.560142 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.560122 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 14:37:09.583445 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.583412 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52"] Apr 21 14:37:09.684924 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.684883 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e464d5c-de0b-4253-ba73-c336561e972c-tmp\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.685113 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.684967 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e464d5c-de0b-4253-ba73-c336561e972c-tls-certs\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.685113 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.685021 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vxh8\" (UniqueName: \"kubernetes.io/projected/0e464d5c-de0b-4253-ba73-c336561e972c-kube-api-access-9vxh8\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.786480 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.786377 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e464d5c-de0b-4253-ba73-c336561e972c-tls-certs\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.786480 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.786423 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vxh8\" (UniqueName: \"kubernetes.io/projected/0e464d5c-de0b-4253-ba73-c336561e972c-kube-api-access-9vxh8\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.786752 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.786487 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e464d5c-de0b-4253-ba73-c336561e972c-tmp\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.788856 ip-10-0-138-93 kubenswrapper[2583]: I0421 
14:37:09.788829 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e464d5c-de0b-4253-ba73-c336561e972c-tmp\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.788973 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.788902 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e464d5c-de0b-4253-ba73-c336561e972c-tls-certs\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.800310 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.800280 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vxh8\" (UniqueName: \"kubernetes.io/projected/0e464d5c-de0b-4253-ba73-c336561e972c-kube-api-access-9vxh8\") pod \"kube-auth-proxy-7b77d4bb4f-5qq52\" (UID: \"0e464d5c-de0b-4253-ba73-c336561e972c\") " pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.840459 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.840418 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" Apr 21 14:37:09.969063 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:09.969024 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52"] Apr 21 14:37:09.972199 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:37:09.972170 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e464d5c_de0b_4253_ba73_c336561e972c.slice/crio-9b64abcd08abe2dd9e4c300f10e9bd7d96f070c6b42ba42102f0e9b913dda3a8 WatchSource:0}: Error finding container 9b64abcd08abe2dd9e4c300f10e9bd7d96f070c6b42ba42102f0e9b913dda3a8: Status 404 returned error can't find the container with id 9b64abcd08abe2dd9e4c300f10e9bd7d96f070c6b42ba42102f0e9b913dda3a8 Apr 21 14:37:10.137834 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:10.137796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" event={"ID":"0e464d5c-de0b-4253-ba73-c336561e972c","Type":"ContainerStarted","Data":"9b64abcd08abe2dd9e4c300f10e9bd7d96f070c6b42ba42102f0e9b913dda3a8"} Apr 21 14:37:15.165411 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:15.165354 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" event={"ID":"0e464d5c-de0b-4253-ba73-c336561e972c","Type":"ContainerStarted","Data":"f3b30dcef6ee6cd07865d505fc033948f26ced9a3dcc02c127a573e88166447d"} Apr 21 14:37:15.184158 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:37:15.184094 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b77d4bb4f-5qq52" podStartSLOduration=1.7043661060000002 podStartE2EDuration="6.184075855s" podCreationTimestamp="2026-04-21 14:37:09 +0000 UTC" firstStartedPulling="2026-04-21 14:37:09.974027176 +0000 UTC m=+709.523253578" lastFinishedPulling="2026-04-21 14:37:14.453736912 +0000 
UTC m=+714.002963327" observedRunningTime="2026-04-21 14:37:15.183706405 +0000 UTC m=+714.732932849" watchObservedRunningTime="2026-04-21 14:37:15.184075855 +0000 UTC m=+714.733302277" Apr 21 14:38:47.334158 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.334077 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b"] Apr 21 14:38:47.337418 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.337400 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.351557 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.351526 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 14:38:47.351718 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.351572 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 14:38:47.352401 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.352379 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-h9qp9\"" Apr 21 14:38:47.352401 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.352393 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 14:38:47.352556 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.352411 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 14:38:47.369069 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.369034 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b"] Apr 21 14:38:47.477340 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.477307 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/668b82af-a1e2-45d7-a1e0-da995e8c4a90-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.477524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.477407 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6nx\" (UniqueName: \"kubernetes.io/projected/668b82af-a1e2-45d7-a1e0-da995e8c4a90-kube-api-access-gl6nx\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.477524 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.477436 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/668b82af-a1e2-45d7-a1e0-da995e8c4a90-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.578942 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.578879 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/668b82af-a1e2-45d7-a1e0-da995e8c4a90-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.579132 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.578996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6nx\" (UniqueName: 
\"kubernetes.io/projected/668b82af-a1e2-45d7-a1e0-da995e8c4a90-kube-api-access-gl6nx\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.579132 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.579029 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/668b82af-a1e2-45d7-a1e0-da995e8c4a90-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.579760 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.579715 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/668b82af-a1e2-45d7-a1e0-da995e8c4a90-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.581385 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.581362 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/668b82af-a1e2-45d7-a1e0-da995e8c4a90-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.591662 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.591593 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6nx\" (UniqueName: \"kubernetes.io/projected/668b82af-a1e2-45d7-a1e0-da995e8c4a90-kube-api-access-gl6nx\") pod \"kuadrant-console-plugin-6cb54b5c86-cnv2b\" (UID: \"668b82af-a1e2-45d7-a1e0-da995e8c4a90\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 
14:38:47.647179 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.647133 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" Apr 21 14:38:47.784298 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:47.784257 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b"] Apr 21 14:38:47.790639 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:38:47.790588 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod668b82af_a1e2_45d7_a1e0_da995e8c4a90.slice/crio-dbf97462d4075b5d34e7193b28fdd1e6b402244093fa3109b96209a731b2f168 WatchSource:0}: Error finding container dbf97462d4075b5d34e7193b28fdd1e6b402244093fa3109b96209a731b2f168: Status 404 returned error can't find the container with id dbf97462d4075b5d34e7193b28fdd1e6b402244093fa3109b96209a731b2f168 Apr 21 14:38:48.481892 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:38:48.481855 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" event={"ID":"668b82af-a1e2-45d7-a1e0-da995e8c4a90","Type":"ContainerStarted","Data":"dbf97462d4075b5d34e7193b28fdd1e6b402244093fa3109b96209a731b2f168"} Apr 21 14:39:13.578233 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:13.578199 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" event={"ID":"668b82af-a1e2-45d7-a1e0-da995e8c4a90","Type":"ContainerStarted","Data":"1abc5b51ca938356ad419d32b59ba209fd9daec2ca116748f2aff9b47de00272"} Apr 21 14:39:13.600104 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:13.600049 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cnv2b" podStartSLOduration=1.214605758 podStartE2EDuration="26.600035422s" podCreationTimestamp="2026-04-21 
14:38:47 +0000 UTC" firstStartedPulling="2026-04-21 14:38:47.792106327 +0000 UTC m=+807.341332729" lastFinishedPulling="2026-04-21 14:39:13.17753599 +0000 UTC m=+832.726762393" observedRunningTime="2026-04-21 14:39:13.597544545 +0000 UTC m=+833.146770965" watchObservedRunningTime="2026-04-21 14:39:13.600035422 +0000 UTC m=+833.149261870" Apr 21 14:39:34.669606 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.669573 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:34.726343 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.726310 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:34.726513 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.726430 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:34.729066 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.729040 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 14:39:34.765312 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.765277 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:34.828478 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.828440 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ztdm\" (UniqueName: \"kubernetes.io/projected/810705ca-6357-4752-ba9c-11c3498c732e-kube-api-access-7ztdm\") pod \"limitador-limitador-7d549b5b-jmsvs\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:34.828655 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.828502 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-file\" (UniqueName: \"kubernetes.io/configmap/810705ca-6357-4752-ba9c-11c3498c732e-config-file\") pod \"limitador-limitador-7d549b5b-jmsvs\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:34.929292 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.929195 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/810705ca-6357-4752-ba9c-11c3498c732e-config-file\") pod \"limitador-limitador-7d549b5b-jmsvs\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:34.929454 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.929322 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ztdm\" (UniqueName: \"kubernetes.io/projected/810705ca-6357-4752-ba9c-11c3498c732e-kube-api-access-7ztdm\") pod \"limitador-limitador-7d549b5b-jmsvs\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:34.929874 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.929850 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/810705ca-6357-4752-ba9c-11c3498c732e-config-file\") pod \"limitador-limitador-7d549b5b-jmsvs\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:34.938044 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:34.938016 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ztdm\" (UniqueName: \"kubernetes.io/projected/810705ca-6357-4752-ba9c-11c3498c732e-kube-api-access-7ztdm\") pod \"limitador-limitador-7d549b5b-jmsvs\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:35.037187 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:39:35.037151 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:35.162876 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.162845 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:35.165905 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:39:35.165873 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810705ca_6357_4752_ba9c_11c3498c732e.slice/crio-aa328e584b18f726151b2638157854cecac328f8a044a1c28b0b302775278b2f WatchSource:0}: Error finding container aa328e584b18f726151b2638157854cecac328f8a044a1c28b0b302775278b2f: Status 404 returned error can't find the container with id aa328e584b18f726151b2638157854cecac328f8a044a1c28b0b302775278b2f Apr 21 14:39:35.509382 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.509347 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-trvww"] Apr 21 14:39:35.513894 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.513869 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:39:35.516586 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.516560 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-h5qxv\"" Apr 21 14:39:35.520538 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.520481 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-trvww"] Apr 21 14:39:35.534030 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.533999 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lml7q\" (UniqueName: \"kubernetes.io/projected/e4a686cf-9907-4fac-8b3f-00760b1600fc-kube-api-access-lml7q\") pod \"authorino-7498df8756-trvww\" (UID: \"e4a686cf-9907-4fac-8b3f-00760b1600fc\") " pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:39:35.635183 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.635146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lml7q\" (UniqueName: \"kubernetes.io/projected/e4a686cf-9907-4fac-8b3f-00760b1600fc-kube-api-access-lml7q\") pod \"authorino-7498df8756-trvww\" (UID: \"e4a686cf-9907-4fac-8b3f-00760b1600fc\") " pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:39:35.645400 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.645372 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lml7q\" (UniqueName: \"kubernetes.io/projected/e4a686cf-9907-4fac-8b3f-00760b1600fc-kube-api-access-lml7q\") pod \"authorino-7498df8756-trvww\" (UID: \"e4a686cf-9907-4fac-8b3f-00760b1600fc\") " pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:39:35.651981 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.651945 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" 
event={"ID":"810705ca-6357-4752-ba9c-11c3498c732e","Type":"ContainerStarted","Data":"aa328e584b18f726151b2638157854cecac328f8a044a1c28b0b302775278b2f"} Apr 21 14:39:35.825048 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.825011 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:39:35.989753 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:35.989706 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-trvww"] Apr 21 14:39:35.994593 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:39:35.994519 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4a686cf_9907_4fac_8b3f_00760b1600fc.slice/crio-bffe8f705b9b2c9950f8f2529c75c6bab4aa4f0ab1312fa4a45760f0da069dde WatchSource:0}: Error finding container bffe8f705b9b2c9950f8f2529c75c6bab4aa4f0ab1312fa4a45760f0da069dde: Status 404 returned error can't find the container with id bffe8f705b9b2c9950f8f2529c75c6bab4aa4f0ab1312fa4a45760f0da069dde Apr 21 14:39:36.658526 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:36.658431 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-trvww" event={"ID":"e4a686cf-9907-4fac-8b3f-00760b1600fc","Type":"ContainerStarted","Data":"bffe8f705b9b2c9950f8f2529c75c6bab4aa4f0ab1312fa4a45760f0da069dde"} Apr 21 14:39:41.677014 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:41.676979 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-trvww" event={"ID":"e4a686cf-9907-4fac-8b3f-00760b1600fc","Type":"ContainerStarted","Data":"d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f"} Apr 21 14:39:41.678422 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:41.678385 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" 
event={"ID":"810705ca-6357-4752-ba9c-11c3498c732e","Type":"ContainerStarted","Data":"883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1"} Apr 21 14:39:41.678544 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:41.678515 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:41.693747 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:41.693684 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-trvww" podStartSLOduration=1.637951717 podStartE2EDuration="6.693667328s" podCreationTimestamp="2026-04-21 14:39:35 +0000 UTC" firstStartedPulling="2026-04-21 14:39:35.996363174 +0000 UTC m=+855.545589579" lastFinishedPulling="2026-04-21 14:39:41.052078788 +0000 UTC m=+860.601305190" observedRunningTime="2026-04-21 14:39:41.692537359 +0000 UTC m=+861.241763790" watchObservedRunningTime="2026-04-21 14:39:41.693667328 +0000 UTC m=+861.242893749" Apr 21 14:39:41.715412 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:41.715356 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" podStartSLOduration=1.829755405 podStartE2EDuration="7.715341574s" podCreationTimestamp="2026-04-21 14:39:34 +0000 UTC" firstStartedPulling="2026-04-21 14:39:35.167777189 +0000 UTC m=+854.717003588" lastFinishedPulling="2026-04-21 14:39:41.053363343 +0000 UTC m=+860.602589757" observedRunningTime="2026-04-21 14:39:41.713348363 +0000 UTC m=+861.262574784" watchObservedRunningTime="2026-04-21 14:39:41.715341574 +0000 UTC m=+861.264568046" Apr 21 14:39:51.910040 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:51.910001 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:51.910566 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:51.910264 2583 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" podUID="810705ca-6357-4752-ba9c-11c3498c732e" containerName="limitador" containerID="cri-o://883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1" gracePeriod=30 Apr 21 14:39:51.912149 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:51.912126 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:52.448237 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.448209 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:52.489784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.489690 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/810705ca-6357-4752-ba9c-11c3498c732e-config-file\") pod \"810705ca-6357-4752-ba9c-11c3498c732e\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " Apr 21 14:39:52.489784 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.489763 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ztdm\" (UniqueName: \"kubernetes.io/projected/810705ca-6357-4752-ba9c-11c3498c732e-kube-api-access-7ztdm\") pod \"810705ca-6357-4752-ba9c-11c3498c732e\" (UID: \"810705ca-6357-4752-ba9c-11c3498c732e\") " Apr 21 14:39:52.490049 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.490026 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810705ca-6357-4752-ba9c-11c3498c732e-config-file" (OuterVolumeSpecName: "config-file") pod "810705ca-6357-4752-ba9c-11c3498c732e" (UID: "810705ca-6357-4752-ba9c-11c3498c732e"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:39:52.492103 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.492068 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810705ca-6357-4752-ba9c-11c3498c732e-kube-api-access-7ztdm" (OuterVolumeSpecName: "kube-api-access-7ztdm") pod "810705ca-6357-4752-ba9c-11c3498c732e" (UID: "810705ca-6357-4752-ba9c-11c3498c732e"). InnerVolumeSpecName "kube-api-access-7ztdm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:39:52.591424 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.591377 2583 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/810705ca-6357-4752-ba9c-11c3498c732e-config-file\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:39:52.591424 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.591418 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ztdm\" (UniqueName: \"kubernetes.io/projected/810705ca-6357-4752-ba9c-11c3498c732e-kube-api-access-7ztdm\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:39:52.718509 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.718475 2583 generic.go:358] "Generic (PLEG): container finished" podID="810705ca-6357-4752-ba9c-11c3498c732e" containerID="883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1" exitCode=0 Apr 21 14:39:52.718692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.718520 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" event={"ID":"810705ca-6357-4752-ba9c-11c3498c732e","Type":"ContainerDied","Data":"883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1"} Apr 21 14:39:52.718692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.718537 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" Apr 21 14:39:52.718692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.718546 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-jmsvs" event={"ID":"810705ca-6357-4752-ba9c-11c3498c732e","Type":"ContainerDied","Data":"aa328e584b18f726151b2638157854cecac328f8a044a1c28b0b302775278b2f"} Apr 21 14:39:52.718692 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.718564 2583 scope.go:117] "RemoveContainer" containerID="883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1" Apr 21 14:39:52.726928 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.726907 2583 scope.go:117] "RemoveContainer" containerID="883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1" Apr 21 14:39:52.727235 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:39:52.727215 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1\": container with ID starting with 883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1 not found: ID does not exist" containerID="883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1" Apr 21 14:39:52.727303 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.727244 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1"} err="failed to get container status \"883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1\": rpc error: code = NotFound desc = could not find container \"883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1\": container with ID starting with 883163adeb66bf22a27b81137c24671e6307b21bc01a773b6853173a37ec38d1 not found: ID does not exist" Apr 21 14:39:52.740120 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.740057 2583 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:52.745909 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.745884 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-jmsvs"] Apr 21 14:39:52.929257 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:52.929217 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810705ca-6357-4752-ba9c-11c3498c732e" path="/var/lib/kubelet/pods/810705ca-6357-4752-ba9c-11c3498c732e/volumes" Apr 21 14:39:55.668936 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.668899 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-zgxnm"] Apr 21 14:39:55.669339 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.669283 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="810705ca-6357-4752-ba9c-11c3498c732e" containerName="limitador" Apr 21 14:39:55.669339 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.669295 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="810705ca-6357-4752-ba9c-11c3498c732e" containerName="limitador" Apr 21 14:39:55.669419 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.669358 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="810705ca-6357-4752-ba9c-11c3498c732e" containerName="limitador" Apr 21 14:39:55.673495 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.673475 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.677966 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.677940 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-q9q95\"" Apr 21 14:39:55.678390 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.678374 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 14:39:55.688719 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.688690 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-zgxnm"] Apr 21 14:39:55.717996 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.717965 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a0a79acd-7330-4755-a213-80b24c9eafb6-data\") pod \"postgres-868db5846d-zgxnm\" (UID: \"a0a79acd-7330-4755-a213-80b24c9eafb6\") " pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.718157 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.718035 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs274\" (UniqueName: \"kubernetes.io/projected/a0a79acd-7330-4755-a213-80b24c9eafb6-kube-api-access-gs274\") pod \"postgres-868db5846d-zgxnm\" (UID: \"a0a79acd-7330-4755-a213-80b24c9eafb6\") " pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.818574 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.818530 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a0a79acd-7330-4755-a213-80b24c9eafb6-data\") pod \"postgres-868db5846d-zgxnm\" (UID: \"a0a79acd-7330-4755-a213-80b24c9eafb6\") " pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.818788 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.818609 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gs274\" (UniqueName: \"kubernetes.io/projected/a0a79acd-7330-4755-a213-80b24c9eafb6-kube-api-access-gs274\") pod \"postgres-868db5846d-zgxnm\" (UID: \"a0a79acd-7330-4755-a213-80b24c9eafb6\") " pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.818980 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.818959 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a0a79acd-7330-4755-a213-80b24c9eafb6-data\") pod \"postgres-868db5846d-zgxnm\" (UID: \"a0a79acd-7330-4755-a213-80b24c9eafb6\") " pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.836003 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.835963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs274\" (UniqueName: \"kubernetes.io/projected/a0a79acd-7330-4755-a213-80b24c9eafb6-kube-api-access-gs274\") pod \"postgres-868db5846d-zgxnm\" (UID: \"a0a79acd-7330-4755-a213-80b24c9eafb6\") " pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:55.984324 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:55.984219 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:39:56.121480 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:56.121445 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-zgxnm"] Apr 21 14:39:56.125095 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:39:56.125062 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0a79acd_7330_4755_a213_80b24c9eafb6.slice/crio-ef08cda3e149b58cecdaab9ea6073778939b8bccc7b031bf927d164a1af1dcb3 WatchSource:0}: Error finding container ef08cda3e149b58cecdaab9ea6073778939b8bccc7b031bf927d164a1af1dcb3: Status 404 returned error can't find the container with id ef08cda3e149b58cecdaab9ea6073778939b8bccc7b031bf927d164a1af1dcb3 Apr 21 14:39:56.734918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:39:56.734880 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-zgxnm" event={"ID":"a0a79acd-7330-4755-a213-80b24c9eafb6","Type":"ContainerStarted","Data":"ef08cda3e149b58cecdaab9ea6073778939b8bccc7b031bf927d164a1af1dcb3"} Apr 21 14:40:04.764822 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:04.764784 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-zgxnm" event={"ID":"a0a79acd-7330-4755-a213-80b24c9eafb6","Type":"ContainerStarted","Data":"88eefb70cf29c8dd8b1fb795c02b8a55082d5e28f7aea956d78da8ae7601b122"} Apr 21 14:40:04.765312 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:04.764907 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:40:04.789778 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:04.789701 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-zgxnm" podStartSLOduration=1.766983551 podStartE2EDuration="9.789685315s" podCreationTimestamp="2026-04-21 14:39:55 +0000 UTC" 
firstStartedPulling="2026-04-21 14:39:56.129600424 +0000 UTC m=+875.678826826" lastFinishedPulling="2026-04-21 14:40:04.152302189 +0000 UTC m=+883.701528590" observedRunningTime="2026-04-21 14:40:04.788627819 +0000 UTC m=+884.337854237" watchObservedRunningTime="2026-04-21 14:40:04.789685315 +0000 UTC m=+884.338911735" Apr 21 14:40:10.797696 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:10.797667 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-zgxnm" Apr 21 14:40:20.763058 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.763001 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-nhgrw"] Apr 21 14:40:20.770273 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.770244 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" Apr 21 14:40:20.773608 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.773586 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 14:40:20.774314 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.774253 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 14:40:20.774314 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.774255 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-krrwz\"" Apr 21 14:40:20.777227 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.777186 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-nhgrw"] Apr 21 14:40:20.843168 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.843137 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhwcf\" (UniqueName: 
\"kubernetes.io/projected/9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7-kube-api-access-xhwcf\") pod \"keycloak-operator-5c4df598dd-nhgrw\" (UID: \"9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7\") " pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" Apr 21 14:40:20.938794 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.938772 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:40:20.939753 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.939716 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:40:20.943523 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.943503 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhwcf\" (UniqueName: \"kubernetes.io/projected/9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7-kube-api-access-xhwcf\") pod \"keycloak-operator-5c4df598dd-nhgrw\" (UID: \"9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7\") " pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" Apr 21 14:40:20.953774 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.953753 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 14:40:20.964487 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.964466 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 14:40:20.974515 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:20.974494 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhwcf\" (UniqueName: \"kubernetes.io/projected/9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7-kube-api-access-xhwcf\") pod \"keycloak-operator-5c4df598dd-nhgrw\" (UID: \"9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7\") " 
pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" Apr 21 14:40:21.083918 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:21.083886 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-krrwz\"" Apr 21 14:40:21.091436 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:21.091406 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" Apr 21 14:40:21.227363 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:21.227335 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-nhgrw"] Apr 21 14:40:21.229610 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:40:21.229580 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1b8fc6_cb8f_45f2_9ac1_5d6c48dffcb7.slice/crio-cbd8c87f2f5d2995dfa436cfd21a5d458af233cd71220e47230af244a07a7463 WatchSource:0}: Error finding container cbd8c87f2f5d2995dfa436cfd21a5d458af233cd71220e47230af244a07a7463: Status 404 returned error can't find the container with id cbd8c87f2f5d2995dfa436cfd21a5d458af233cd71220e47230af244a07a7463 Apr 21 14:40:21.829548 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:21.829510 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" event={"ID":"9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7","Type":"ContainerStarted","Data":"cbd8c87f2f5d2995dfa436cfd21a5d458af233cd71220e47230af244a07a7463"} Apr 21 14:40:27.859255 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:40:27.859214 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" event={"ID":"9d1b8fc6-cb8f-45f2-9ac1-5d6c48dffcb7","Type":"ContainerStarted","Data":"deb455c5e2e76c8a26a48563c205c916ecdf01d3ad8d83e05c09c30c11af4bcf"} Apr 21 14:40:27.879288 ip-10-0-138-93 kubenswrapper[2583]: I0421 
14:40:27.879225 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-nhgrw" podStartSLOduration=2.277344775 podStartE2EDuration="7.879206587s" podCreationTimestamp="2026-04-21 14:40:20 +0000 UTC" firstStartedPulling="2026-04-21 14:40:21.231560756 +0000 UTC m=+900.780787158" lastFinishedPulling="2026-04-21 14:40:26.833422554 +0000 UTC m=+906.382648970" observedRunningTime="2026-04-21 14:40:27.878871909 +0000 UTC m=+907.428098331" watchObservedRunningTime="2026-04-21 14:40:27.879206587 +0000 UTC m=+907.428433009" Apr 21 14:41:08.204086 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.204043 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-trvww"] Apr 21 14:41:08.204658 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.204267 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-trvww" podUID="e4a686cf-9907-4fac-8b3f-00760b1600fc" containerName="authorino" containerID="cri-o://d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f" gracePeriod=30 Apr 21 14:41:08.452549 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.452522 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:41:08.494281 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.494198 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lml7q\" (UniqueName: \"kubernetes.io/projected/e4a686cf-9907-4fac-8b3f-00760b1600fc-kube-api-access-lml7q\") pod \"e4a686cf-9907-4fac-8b3f-00760b1600fc\" (UID: \"e4a686cf-9907-4fac-8b3f-00760b1600fc\") " Apr 21 14:41:08.496295 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.496266 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a686cf-9907-4fac-8b3f-00760b1600fc-kube-api-access-lml7q" (OuterVolumeSpecName: "kube-api-access-lml7q") pod "e4a686cf-9907-4fac-8b3f-00760b1600fc" (UID: "e4a686cf-9907-4fac-8b3f-00760b1600fc"). InnerVolumeSpecName "kube-api-access-lml7q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:41:08.595080 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.595039 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lml7q\" (UniqueName: \"kubernetes.io/projected/e4a686cf-9907-4fac-8b3f-00760b1600fc-kube-api-access-lml7q\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:41:08.998245 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.998209 2583 generic.go:358] "Generic (PLEG): container finished" podID="e4a686cf-9907-4fac-8b3f-00760b1600fc" containerID="d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f" exitCode=0 Apr 21 14:41:08.998408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.998255 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-trvww" Apr 21 14:41:08.998408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.998280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-trvww" event={"ID":"e4a686cf-9907-4fac-8b3f-00760b1600fc","Type":"ContainerDied","Data":"d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f"} Apr 21 14:41:08.998408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.998321 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-trvww" event={"ID":"e4a686cf-9907-4fac-8b3f-00760b1600fc","Type":"ContainerDied","Data":"bffe8f705b9b2c9950f8f2529c75c6bab4aa4f0ab1312fa4a45760f0da069dde"} Apr 21 14:41:08.998408 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:08.998338 2583 scope.go:117] "RemoveContainer" containerID="d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f" Apr 21 14:41:09.008861 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:09.008812 2583 scope.go:117] "RemoveContainer" containerID="d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f" Apr 21 14:41:09.009304 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:41:09.009282 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f\": container with ID starting with d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f not found: ID does not exist" containerID="d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f" Apr 21 14:41:09.009389 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:09.009313 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f"} err="failed to get container status \"d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f\": rpc error: code = 
NotFound desc = could not find container \"d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f\": container with ID starting with d04bd18d7c30b31d6c17b80e71c780cd16f29222e993b2680a96746e9f4dc09f not found: ID does not exist" Apr 21 14:41:09.052384 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:09.052347 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-trvww"] Apr 21 14:41:09.056589 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:09.056555 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-trvww"] Apr 21 14:41:10.927984 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:41:10.927949 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a686cf-9907-4fac-8b3f-00760b1600fc" path="/var/lib/kubelet/pods/e4a686cf-9907-4fac-8b3f-00760b1600fc/volumes" Apr 21 14:45:00.149193 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.149144 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613045-t44hj"] Apr 21 14:45:00.149685 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.149569 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4a686cf-9907-4fac-8b3f-00760b1600fc" containerName="authorino" Apr 21 14:45:00.149685 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.149580 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a686cf-9907-4fac-8b3f-00760b1600fc" containerName="authorino" Apr 21 14:45:00.149685 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.149651 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4a686cf-9907-4fac-8b3f-00760b1600fc" containerName="authorino" Apr 21 14:45:00.153072 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.153035 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:45:00.154915 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.154884 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613045-t44hj"] Apr 21 14:45:00.155586 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.155555 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-h4b5p\"" Apr 21 14:45:00.218467 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.218407 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzgsc\" (UniqueName: \"kubernetes.io/projected/c73326da-7316-4aa9-acfe-a5532c7f810a-kube-api-access-bzgsc\") pod \"maas-api-key-cleanup-29613045-t44hj\" (UID: \"c73326da-7316-4aa9-acfe-a5532c7f810a\") " pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:45:00.319665 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.319622 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzgsc\" (UniqueName: \"kubernetes.io/projected/c73326da-7316-4aa9-acfe-a5532c7f810a-kube-api-access-bzgsc\") pod \"maas-api-key-cleanup-29613045-t44hj\" (UID: \"c73326da-7316-4aa9-acfe-a5532c7f810a\") " pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:45:00.330252 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.330210 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzgsc\" (UniqueName: \"kubernetes.io/projected/c73326da-7316-4aa9-acfe-a5532c7f810a-kube-api-access-bzgsc\") pod \"maas-api-key-cleanup-29613045-t44hj\" (UID: \"c73326da-7316-4aa9-acfe-a5532c7f810a\") " pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:45:00.465046 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.464922 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:45:00.599612 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.599562 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613045-t44hj"] Apr 21 14:45:00.602412 ip-10-0-138-93 kubenswrapper[2583]: W0421 14:45:00.602371 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73326da_7316_4aa9_acfe_a5532c7f810a.slice/crio-fdf329d87309de6282d4b2fdeb5982a0fe00fbd48c5e4fce07887a2e0d1ee9c5 WatchSource:0}: Error finding container fdf329d87309de6282d4b2fdeb5982a0fe00fbd48c5e4fce07887a2e0d1ee9c5: Status 404 returned error can't find the container with id fdf329d87309de6282d4b2fdeb5982a0fe00fbd48c5e4fce07887a2e0d1ee9c5 Apr 21 14:45:00.604075 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.604058 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:45:00.810281 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:00.810195 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerStarted","Data":"fdf329d87309de6282d4b2fdeb5982a0fe00fbd48c5e4fce07887a2e0d1ee9c5"} Apr 21 14:45:03.821782 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:03.821718 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerStarted","Data":"360b8c83320aab6e796a745862765b9c1359d96030f7761c72c1196424eae133"} Apr 21 14:45:03.839573 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:03.839510 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" podStartSLOduration=0.929269631 podStartE2EDuration="3.839490588s" podCreationTimestamp="2026-04-21 
14:45:00 +0000 UTC" firstStartedPulling="2026-04-21 14:45:00.604189385 +0000 UTC m=+1180.153415784" lastFinishedPulling="2026-04-21 14:45:03.514410338 +0000 UTC m=+1183.063636741" observedRunningTime="2026-04-21 14:45:03.836411315 +0000 UTC m=+1183.385637735" watchObservedRunningTime="2026-04-21 14:45:03.839490588 +0000 UTC m=+1183.388717010" Apr 21 14:45:20.967004 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:20.966975 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:45:20.967445 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:20.967251 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:45:24.899902 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:24.899863 2583 generic.go:358] "Generic (PLEG): container finished" podID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerID="360b8c83320aab6e796a745862765b9c1359d96030f7761c72c1196424eae133" exitCode=6 Apr 21 14:45:24.900287 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:24.899911 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerDied","Data":"360b8c83320aab6e796a745862765b9c1359d96030f7761c72c1196424eae133"} Apr 21 14:45:24.900287 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:24.900247 2583 scope.go:117] "RemoveContainer" containerID="360b8c83320aab6e796a745862765b9c1359d96030f7761c72c1196424eae133" Apr 21 14:45:25.904740 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:25.904689 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerStarted","Data":"d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a"} Apr 
21 14:45:45.972792 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:45.972749 2583 generic.go:358] "Generic (PLEG): container finished" podID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerID="d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a" exitCode=6 Apr 21 14:45:45.973315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:45.972821 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerDied","Data":"d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a"} Apr 21 14:45:45.973315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:45.972867 2583 scope.go:117] "RemoveContainer" containerID="360b8c83320aab6e796a745862765b9c1359d96030f7761c72c1196424eae133" Apr 21 14:45:45.973315 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:45.973261 2583 scope.go:117] "RemoveContainer" containerID="d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a" Apr 21 14:45:45.973531 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:45:45.973510 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613045-t44hj_opendatahub(c73326da-7316-4aa9-acfe-a5532c7f810a)\"" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" Apr 21 14:45:57.923981 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:57.923945 2583 scope.go:117] "RemoveContainer" containerID="d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a" Apr 21 14:45:59.022027 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:45:59.021986 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerStarted","Data":"fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac"} Apr 
21 14:46:00.011513 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:00.011460 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613045-t44hj"] Apr 21 14:46:00.025779 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:00.025717 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" containerID="cri-o://fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac" gracePeriod=30 Apr 21 14:46:18.764068 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:18.764033 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:46:18.819041 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:18.819011 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzgsc\" (UniqueName: \"kubernetes.io/projected/c73326da-7316-4aa9-acfe-a5532c7f810a-kube-api-access-bzgsc\") pod \"c73326da-7316-4aa9-acfe-a5532c7f810a\" (UID: \"c73326da-7316-4aa9-acfe-a5532c7f810a\") " Apr 21 14:46:18.821234 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:18.821205 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73326da-7316-4aa9-acfe-a5532c7f810a-kube-api-access-bzgsc" (OuterVolumeSpecName: "kube-api-access-bzgsc") pod "c73326da-7316-4aa9-acfe-a5532c7f810a" (UID: "c73326da-7316-4aa9-acfe-a5532c7f810a"). InnerVolumeSpecName "kube-api-access-bzgsc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:46:18.919629 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:18.919590 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzgsc\" (UniqueName: \"kubernetes.io/projected/c73326da-7316-4aa9-acfe-a5532c7f810a-kube-api-access-bzgsc\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 14:46:19.091127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.091045 2583 generic.go:358] "Generic (PLEG): container finished" podID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerID="fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac" exitCode=6 Apr 21 14:46:19.091127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.091086 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerDied","Data":"fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac"} Apr 21 14:46:19.091127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.091109 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" event={"ID":"c73326da-7316-4aa9-acfe-a5532c7f810a","Type":"ContainerDied","Data":"fdf329d87309de6282d4b2fdeb5982a0fe00fbd48c5e4fce07887a2e0d1ee9c5"} Apr 21 14:46:19.091127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.091110 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613045-t44hj" Apr 21 14:46:19.091127 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.091126 2583 scope.go:117] "RemoveContainer" containerID="fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac" Apr 21 14:46:19.100540 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.100517 2583 scope.go:117] "RemoveContainer" containerID="d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a" Apr 21 14:46:19.108239 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.108143 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613045-t44hj"] Apr 21 14:46:19.108307 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.108241 2583 scope.go:117] "RemoveContainer" containerID="fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac" Apr 21 14:46:19.108984 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:46:19.108656 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac\": container with ID starting with fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac not found: ID does not exist" containerID="fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac" Apr 21 14:46:19.108984 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.108695 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac"} err="failed to get container status \"fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac\": rpc error: code = NotFound desc = could not find container \"fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac\": container with ID starting with fab9691caa8a40e7dd370b8191eef92a1b7e6c022c431f5e7da8a969b59c6bac not found: ID does not exist" Apr 21 14:46:19.108984 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:46:19.108742 2583 scope.go:117] "RemoveContainer" containerID="d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a" Apr 21 14:46:19.109193 ip-10-0-138-93 kubenswrapper[2583]: E0421 14:46:19.109119 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a\": container with ID starting with d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a not found: ID does not exist" containerID="d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a" Apr 21 14:46:19.109193 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.109145 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a"} err="failed to get container status \"d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a\": rpc error: code = NotFound desc = could not find container \"d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a\": container with ID starting with d82349dd8ad8552c25d902f94721cd54906459ad38ae5d5646ed362b3f54fe4a not found: ID does not exist" Apr 21 14:46:19.110488 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:19.110467 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613045-t44hj"] Apr 21 14:46:20.928010 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:46:20.927975 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" path="/var/lib/kubelet/pods/c73326da-7316-4aa9-acfe-a5532c7f810a/volumes" Apr 21 14:50:20.993325 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:50:20.993296 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:50:20.994062 ip-10-0-138-93 
kubenswrapper[2583]: I0421 14:50:20.994040 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:55:21.019743 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:55:21.019637 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 14:55:21.021786 ip-10-0-138-93 kubenswrapper[2583]: I0421 14:55:21.021764 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:00:00.138578 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.138531 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613060-xs8ks"] Apr 21 15:00:00.139212 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.139117 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:00:00.139212 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.139139 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:00:00.139212 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.139159 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:00:00.139212 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.139170 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:00:00.139425 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.139270 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:00:00.139425 
ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.139290 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:00:00.142128 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.142101 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:00:00.145046 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.145024 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-h4b5p\"" Apr 21 15:00:00.159638 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.159602 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613060-xs8ks"] Apr 21 15:00:00.172911 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.172876 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvvf2\" (UniqueName: \"kubernetes.io/projected/20cc1393-26b5-43e4-8759-1652d00ddacc-kube-api-access-fvvf2\") pod \"maas-api-key-cleanup-29613060-xs8ks\" (UID: \"20cc1393-26b5-43e4-8759-1652d00ddacc\") " pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:00:00.274271 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.274225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvvf2\" (UniqueName: \"kubernetes.io/projected/20cc1393-26b5-43e4-8759-1652d00ddacc-kube-api-access-fvvf2\") pod \"maas-api-key-cleanup-29613060-xs8ks\" (UID: \"20cc1393-26b5-43e4-8759-1652d00ddacc\") " pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:00:00.282659 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.282622 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvvf2\" (UniqueName: \"kubernetes.io/projected/20cc1393-26b5-43e4-8759-1652d00ddacc-kube-api-access-fvvf2\") pod 
\"maas-api-key-cleanup-29613060-xs8ks\" (UID: \"20cc1393-26b5-43e4-8759-1652d00ddacc\") " pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:00:00.452868 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.452772 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:00:00.579997 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.579961 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613060-xs8ks"] Apr 21 15:00:00.583089 ip-10-0-138-93 kubenswrapper[2583]: W0421 15:00:00.583059 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cc1393_26b5_43e4_8759_1652d00ddacc.slice/crio-d51855883fa679ca9b2ae641eac9bd2988c0e08c747cac9ccf66f954ad1d935a WatchSource:0}: Error finding container d51855883fa679ca9b2ae641eac9bd2988c0e08c747cac9ccf66f954ad1d935a: Status 404 returned error can't find the container with id d51855883fa679ca9b2ae641eac9bd2988c0e08c747cac9ccf66f954ad1d935a Apr 21 15:00:00.587619 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.587599 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:00:00.958639 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.958601 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerStarted","Data":"6c1a05aa9d5202f27762d754955d6da0f9a38b92fd0e353499ab3011de0ee91f"} Apr 21 15:00:00.958849 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.958647 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerStarted","Data":"d51855883fa679ca9b2ae641eac9bd2988c0e08c747cac9ccf66f954ad1d935a"} Apr 21 15:00:00.975604 
ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:00.975535 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" podStartSLOduration=0.975518351 podStartE2EDuration="975.518351ms" podCreationTimestamp="2026-04-21 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:00:00.974163967 +0000 UTC m=+2080.523390387" watchObservedRunningTime="2026-04-21 15:00:00.975518351 +0000 UTC m=+2080.524744809" Apr 21 15:00:21.045922 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:21.045878 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:00:21.048088 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:21.048014 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:00:22.030428 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:22.030389 2583 generic.go:358] "Generic (PLEG): container finished" podID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerID="6c1a05aa9d5202f27762d754955d6da0f9a38b92fd0e353499ab3011de0ee91f" exitCode=6 Apr 21 15:00:22.030603 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:22.030461 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerDied","Data":"6c1a05aa9d5202f27762d754955d6da0f9a38b92fd0e353499ab3011de0ee91f"} Apr 21 15:00:22.030851 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:22.030830 2583 scope.go:117] "RemoveContainer" containerID="6c1a05aa9d5202f27762d754955d6da0f9a38b92fd0e353499ab3011de0ee91f" Apr 21 15:00:23.035581 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:23.035546 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerStarted","Data":"c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77"} Apr 21 15:00:43.113142 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:43.113105 2583 generic.go:358] "Generic (PLEG): container finished" podID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerID="c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77" exitCode=6 Apr 21 15:00:43.113583 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:43.113183 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerDied","Data":"c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77"} Apr 21 15:00:43.113583 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:43.113230 2583 scope.go:117] "RemoveContainer" containerID="6c1a05aa9d5202f27762d754955d6da0f9a38b92fd0e353499ab3011de0ee91f" Apr 21 15:00:43.113583 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:43.113537 2583 scope.go:117] "RemoveContainer" containerID="c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77" Apr 21 15:00:43.113826 ip-10-0-138-93 kubenswrapper[2583]: E0421 15:00:43.113806 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613060-xs8ks_opendatahub(20cc1393-26b5-43e4-8759-1652d00ddacc)\"" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" Apr 21 15:00:56.924216 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:56.924172 2583 scope.go:117] "RemoveContainer" containerID="c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77" Apr 21 15:00:58.167579 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:58.167534 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerStarted","Data":"10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797"} Apr 21 15:00:59.195915 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:59.195878 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613060-xs8ks"] Apr 21 15:00:59.196394 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:00:59.196086 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" containerID="cri-o://10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797" gracePeriod=30 Apr 21 15:01:17.841364 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:17.841334 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:01:17.894086 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:17.894044 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvvf2\" (UniqueName: \"kubernetes.io/projected/20cc1393-26b5-43e4-8759-1652d00ddacc-kube-api-access-fvvf2\") pod \"20cc1393-26b5-43e4-8759-1652d00ddacc\" (UID: \"20cc1393-26b5-43e4-8759-1652d00ddacc\") " Apr 21 15:01:17.896135 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:17.896107 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cc1393-26b5-43e4-8759-1652d00ddacc-kube-api-access-fvvf2" (OuterVolumeSpecName: "kube-api-access-fvvf2") pod "20cc1393-26b5-43e4-8759-1652d00ddacc" (UID: "20cc1393-26b5-43e4-8759-1652d00ddacc"). InnerVolumeSpecName "kube-api-access-fvvf2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:01:17.994911 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:17.994818 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvvf2\" (UniqueName: \"kubernetes.io/projected/20cc1393-26b5-43e4-8759-1652d00ddacc-kube-api-access-fvvf2\") on node \"ip-10-0-138-93.ec2.internal\" DevicePath \"\"" Apr 21 15:01:18.239183 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.239146 2583 generic.go:358] "Generic (PLEG): container finished" podID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerID="10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797" exitCode=6 Apr 21 15:01:18.239415 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.239209 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" Apr 21 15:01:18.239415 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.239232 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerDied","Data":"10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797"} Apr 21 15:01:18.239415 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.239281 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613060-xs8ks" event={"ID":"20cc1393-26b5-43e4-8759-1652d00ddacc","Type":"ContainerDied","Data":"d51855883fa679ca9b2ae641eac9bd2988c0e08c747cac9ccf66f954ad1d935a"} Apr 21 15:01:18.239415 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.239301 2583 scope.go:117] "RemoveContainer" containerID="10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797" Apr 21 15:01:18.248273 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.248251 2583 scope.go:117] "RemoveContainer" containerID="c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77" Apr 21 15:01:18.255859 ip-10-0-138-93 
kubenswrapper[2583]: I0421 15:01:18.255839 2583 scope.go:117] "RemoveContainer" containerID="10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797" Apr 21 15:01:18.256105 ip-10-0-138-93 kubenswrapper[2583]: E0421 15:01:18.256087 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797\": container with ID starting with 10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797 not found: ID does not exist" containerID="10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797" Apr 21 15:01:18.256146 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.256116 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797"} err="failed to get container status \"10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797\": rpc error: code = NotFound desc = could not find container \"10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797\": container with ID starting with 10ea0fa9a59e3a57134a89a0f550cc18a4bdeb9d0f018afbae865ffe43257797 not found: ID does not exist" Apr 21 15:01:18.256146 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.256136 2583 scope.go:117] "RemoveContainer" containerID="c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77" Apr 21 15:01:18.256384 ip-10-0-138-93 kubenswrapper[2583]: E0421 15:01:18.256361 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77\": container with ID starting with c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77 not found: ID does not exist" containerID="c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77" Apr 21 15:01:18.256456 ip-10-0-138-93 kubenswrapper[2583]: I0421 
15:01:18.256395 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77"} err="failed to get container status \"c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77\": rpc error: code = NotFound desc = could not find container \"c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77\": container with ID starting with c219bac8a185b63b2b865e73324d4f487d580ad85f075dea803b18e32e406f77 not found: ID does not exist" Apr 21 15:01:18.260672 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.260646 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613060-xs8ks"] Apr 21 15:01:18.265579 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.265555 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613060-xs8ks"] Apr 21 15:01:18.928861 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:01:18.928820 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" path="/var/lib/kubelet/pods/20cc1393-26b5-43e4-8759-1652d00ddacc/volumes" Apr 21 15:05:21.074893 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:05:21.074862 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:05:21.077981 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:05:21.077962 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:10:21.100111 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:21.099998 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:10:21.106187 ip-10-0-138-93 
kubenswrapper[2583]: I0421 15:10:21.106166 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pcn9k_e3bb98dc-f964-4937-9c95-4899ca412b4a/ovn-acl-logging/0.log" Apr 21 15:10:25.114670 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:25.114637 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7df645bd74-gndhb_4c22033d-0f2b-48bc-b38a-a9e3da2e97ed/manager/0.log" Apr 21 15:10:25.367035 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:25.366941 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-zgxnm_a0a79acd-7330-4755-a213-80b24c9eafb6/postgres/0.log" Apr 21 15:10:26.916738 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:26.916696 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-cnv2b_668b82af-a1e2-45d7-a1e0-da995e8c4a90/kuadrant-console-plugin/0.log" Apr 21 15:10:28.051303 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:28.051268 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b77d4bb4f-5qq52_0e464d5c-de0b-4253-ba73-c336561e972c/kube-auth-proxy/0.log" Apr 21 15:10:33.195083 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195039 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qvxwt/must-gather-9mpx5"] Apr 21 15:10:33.195494 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195402 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195494 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195413 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195494 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195426 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195494 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195431 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195494 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195455 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:10:33.195494 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195461 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:10:33.195703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195509 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c73326da-7316-4aa9-acfe-a5532c7f810a" containerName="cleanup" Apr 21 15:10:33.195703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195519 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195529 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195535 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195601 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" containerName="cleanup" Apr 21 15:10:33.195703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.195607 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cc1393-26b5-43e4-8759-1652d00ddacc" 
containerName="cleanup" Apr 21 15:10:33.198878 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.198856 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.201659 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.201637 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qvxwt\"/\"default-dockercfg-v77v4\"" Apr 21 15:10:33.202618 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.202600 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qvxwt\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.202712 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.202648 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qvxwt\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.223185 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.223154 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qvxwt/must-gather-9mpx5"] Apr 21 15:10:33.266440 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.266402 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpmql\" (UniqueName: \"kubernetes.io/projected/06551abb-dac4-4fbd-8779-c731789d3ecf-kube-api-access-qpmql\") pod \"must-gather-9mpx5\" (UID: \"06551abb-dac4-4fbd-8779-c731789d3ecf\") " pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.266648 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.266450 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/06551abb-dac4-4fbd-8779-c731789d3ecf-must-gather-output\") pod \"must-gather-9mpx5\" (UID: \"06551abb-dac4-4fbd-8779-c731789d3ecf\") " pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.367986 ip-10-0-138-93 
kubenswrapper[2583]: I0421 15:10:33.367942 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpmql\" (UniqueName: \"kubernetes.io/projected/06551abb-dac4-4fbd-8779-c731789d3ecf-kube-api-access-qpmql\") pod \"must-gather-9mpx5\" (UID: \"06551abb-dac4-4fbd-8779-c731789d3ecf\") " pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.368191 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.368001 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/06551abb-dac4-4fbd-8779-c731789d3ecf-must-gather-output\") pod \"must-gather-9mpx5\" (UID: \"06551abb-dac4-4fbd-8779-c731789d3ecf\") " pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.368385 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.368362 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/06551abb-dac4-4fbd-8779-c731789d3ecf-must-gather-output\") pod \"must-gather-9mpx5\" (UID: \"06551abb-dac4-4fbd-8779-c731789d3ecf\") " pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.384439 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.384402 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpmql\" (UniqueName: \"kubernetes.io/projected/06551abb-dac4-4fbd-8779-c731789d3ecf-kube-api-access-qpmql\") pod \"must-gather-9mpx5\" (UID: \"06551abb-dac4-4fbd-8779-c731789d3ecf\") " pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.507775 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.507655 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qvxwt/must-gather-9mpx5" Apr 21 15:10:33.636568 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.636541 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qvxwt/must-gather-9mpx5"] Apr 21 15:10:33.638961 ip-10-0-138-93 kubenswrapper[2583]: W0421 15:10:33.638927 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06551abb_dac4_4fbd_8779_c731789d3ecf.slice/crio-ec5d0d01f432b7a1e63b308cb0efb414c75133373ff3e36ed502cdba0c3fd9ae WatchSource:0}: Error finding container ec5d0d01f432b7a1e63b308cb0efb414c75133373ff3e36ed502cdba0c3fd9ae: Status 404 returned error can't find the container with id ec5d0d01f432b7a1e63b308cb0efb414c75133373ff3e36ed502cdba0c3fd9ae Apr 21 15:10:33.640691 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:33.640668 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:10:34.175380 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:34.175342 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qvxwt/must-gather-9mpx5" event={"ID":"06551abb-dac4-4fbd-8779-c731789d3ecf","Type":"ContainerStarted","Data":"ec5d0d01f432b7a1e63b308cb0efb414c75133373ff3e36ed502cdba0c3fd9ae"} Apr 21 15:10:35.182073 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:35.181961 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qvxwt/must-gather-9mpx5" event={"ID":"06551abb-dac4-4fbd-8779-c731789d3ecf","Type":"ContainerStarted","Data":"853535e79164fffb64df1cbd4973dca2b692e9a62181ecb221c756ee76b1414f"} Apr 21 15:10:35.182073 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:35.182008 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qvxwt/must-gather-9mpx5" 
event={"ID":"06551abb-dac4-4fbd-8779-c731789d3ecf","Type":"ContainerStarted","Data":"277613a025108b3b801c1987df31650be7bffb388099ebf92050ac3aad8b1835"} Apr 21 15:10:35.205527 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:35.205460 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qvxwt/must-gather-9mpx5" podStartSLOduration=1.255027666 podStartE2EDuration="2.205437411s" podCreationTimestamp="2026-04-21 15:10:33 +0000 UTC" firstStartedPulling="2026-04-21 15:10:33.640893853 +0000 UTC m=+2713.190120266" lastFinishedPulling="2026-04-21 15:10:34.591303609 +0000 UTC m=+2714.140530011" observedRunningTime="2026-04-21 15:10:35.202618694 +0000 UTC m=+2714.751845324" watchObservedRunningTime="2026-04-21 15:10:35.205437411 +0000 UTC m=+2714.754663832" Apr 21 15:10:36.452403 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:36.452352 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-66rqj_fddef6cc-f623-494d-af92-4fd8811401a4/global-pull-secret-syncer/0.log" Apr 21 15:10:36.647945 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:36.647896 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jqh94_a974b412-8de8-4e89-965b-a82f6e82ccf8/konnectivity-agent/0.log" Apr 21 15:10:36.735130 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:36.735026 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-93.ec2.internal_1f5719e4ae571eebc033266ca01f65bf/haproxy/0.log" Apr 21 15:10:41.695635 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:41.695595 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-cnv2b_668b82af-a1e2-45d7-a1e0-da995e8c4a90/kuadrant-console-plugin/0.log" Apr 21 15:10:43.379648 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.379560 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/alertmanager/0.log" Apr 21 15:10:43.428200 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.428124 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/config-reloader/0.log" Apr 21 15:10:43.461553 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.461516 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/kube-rbac-proxy-web/0.log" Apr 21 15:10:43.490651 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.490609 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/kube-rbac-proxy/0.log" Apr 21 15:10:43.537564 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.537529 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/kube-rbac-proxy-metric/0.log" Apr 21 15:10:43.579746 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.579691 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/prom-label-proxy/0.log" Apr 21 15:10:43.618435 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.618401 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ab2c0316-53f9-4129-9cc1-a7970a962cb9/init-config-reloader/0.log" Apr 21 15:10:43.865822 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.865785 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-67ffd846c7-zsv56_cf8c72e3-8e7f-482a-817d-7b5e2fc81ab5/metrics-server/0.log" Apr 21 15:10:43.966211 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:43.966174 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-9bgw4_0804186e-8a55-4aa4-9a48-ec393ed70e24/node-exporter/0.log" Apr 21 15:10:44.010902 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.010870 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9bgw4_0804186e-8a55-4aa4-9a48-ec393ed70e24/kube-rbac-proxy/0.log" Apr 21 15:10:44.056401 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.056370 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9bgw4_0804186e-8a55-4aa4-9a48-ec393ed70e24/init-textfile/0.log" Apr 21 15:10:44.484260 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.484222 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/prometheus/0.log" Apr 21 15:10:44.518718 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.518687 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/config-reloader/0.log" Apr 21 15:10:44.548596 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.548563 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/thanos-sidecar/0.log" Apr 21 15:10:44.574084 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.574046 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/kube-rbac-proxy-web/0.log" Apr 21 15:10:44.598932 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.598903 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/kube-rbac-proxy/0.log" Apr 21 15:10:44.630176 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.630148 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/kube-rbac-proxy-thanos/0.log" Apr 21 15:10:44.660962 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.660929 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7ce890dd-192a-44d8-9ab9-7597bb0d75a5/init-config-reloader/0.log" Apr 21 15:10:44.698070 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.698041 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gnbgt_10e06e54-27b1-40e3-868d-0417cbbfed4c/prometheus-operator/0.log" Apr 21 15:10:44.721587 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.721550 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gnbgt_10e06e54-27b1-40e3-868d-0417cbbfed4c/kube-rbac-proxy/0.log" Apr 21 15:10:44.754432 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.754347 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xfvsw_b6a7d3a3-8dbf-493f-99dd-7922d8495302/prometheus-operator-admission-webhook/0.log" Apr 21 15:10:44.787573 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.787541 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-86c4c66c-rfkzl_ca1cc24d-4246-470d-b19e-e1e3c38e3d5f/telemeter-client/0.log" Apr 21 15:10:44.827260 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.827227 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-86c4c66c-rfkzl_ca1cc24d-4246-470d-b19e-e1e3c38e3d5f/reload/0.log" Apr 21 15:10:44.861867 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.861837 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-86c4c66c-rfkzl_ca1cc24d-4246-470d-b19e-e1e3c38e3d5f/kube-rbac-proxy/0.log" Apr 21 15:10:44.894772 
ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.894718 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-db764b9c9-q4t6s_a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5/thanos-query/0.log" Apr 21 15:10:44.921449 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.921414 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-db764b9c9-q4t6s_a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5/kube-rbac-proxy-web/0.log" Apr 21 15:10:44.960683 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.960630 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-db764b9c9-q4t6s_a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5/kube-rbac-proxy/0.log" Apr 21 15:10:44.989720 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:44.989695 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-db764b9c9-q4t6s_a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5/prom-label-proxy/0.log" Apr 21 15:10:45.015137 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.015036 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-db764b9c9-q4t6s_a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5/kube-rbac-proxy-rules/0.log" Apr 21 15:10:45.037949 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.037916 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-db764b9c9-q4t6s_a6cb3be0-b8f9-40f2-8ce9-8d112541cfb5/kube-rbac-proxy-metrics/0.log" Apr 21 15:10:45.409556 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.409510 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn"] Apr 21 15:10:45.414529 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.414498 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.424828 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.424758 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn"] Apr 21 15:10:45.496551 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.496515 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-sys\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.497233 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.497205 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-podres\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.497404 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.497388 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-lib-modules\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.497530 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.497518 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-proc\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " 
pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.497703 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.497688 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntf6g\" (UniqueName: \"kubernetes.io/projected/12be3802-250e-4b90-9084-11d5a8266439-kube-api-access-ntf6g\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599149 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599100 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-podres\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599344 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599247 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-lib-modules\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599344 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599267 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-podres\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599344 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599287 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-proc\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599510 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599347 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-proc\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599510 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599379 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntf6g\" (UniqueName: \"kubernetes.io/projected/12be3802-250e-4b90-9084-11d5a8266439-kube-api-access-ntf6g\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599510 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-lib-modules\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599510 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599405 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-sys\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.599510 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.599438 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12be3802-250e-4b90-9084-11d5a8266439-sys\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.609064 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.609032 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntf6g\" (UniqueName: \"kubernetes.io/projected/12be3802-250e-4b90-9084-11d5a8266439-kube-api-access-ntf6g\") pod \"perf-node-gather-daemonset-s2kcn\" (UID: \"12be3802-250e-4b90-9084-11d5a8266439\") " pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.727409 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.727313 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:45.884776 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:45.884746 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn"] Apr 21 15:10:46.120357 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:46.120311 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-bcjt5_52f4ef6e-0001-42d5-acda-d8d1b7ce4e20/networking-console-plugin/0.log" Apr 21 15:10:46.243770 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:46.243702 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" event={"ID":"12be3802-250e-4b90-9084-11d5a8266439","Type":"ContainerStarted","Data":"acaeb1769c9bae91a03f2cd60be0b4527bb042b4dee5b6aa235d728d24633698"} Apr 21 15:10:46.243944 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:46.243774 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" event={"ID":"12be3802-250e-4b90-9084-11d5a8266439","Type":"ContainerStarted","Data":"583ed607f81cb4f94caf29bf43767a3c7e1fbce74baff231658bb34f3eca4cf4"} Apr 21 15:10:46.265520 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:46.265466 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" podStartSLOduration=1.265446838 podStartE2EDuration="1.265446838s" podCreationTimestamp="2026-04-21 15:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:46.264110537 +0000 UTC m=+2725.813336958" watchObservedRunningTime="2026-04-21 15:10:46.265446838 +0000 UTC m=+2725.814673261" Apr 21 15:10:47.247287 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:47.247252 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:48.619882 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:48.619849 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-flqfn_1e1883f5-d573-490b-964d-821444217181/dns/0.log" Apr 21 15:10:48.640484 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:48.640456 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-flqfn_1e1883f5-d573-490b-964d-821444217181/kube-rbac-proxy/0.log" Apr 21 15:10:48.742468 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:48.742440 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fr8hk_0e814615-0b94-4de9-9bfd-1b36c817910a/dns-node-resolver/0.log" Apr 21 15:10:49.317453 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:49.317420 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2gtk9_9cd06c0c-27bc-443c-a928-76a45a2b2514/node-ca/0.log" Apr 
21 15:10:50.675838 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:50.675805 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b77d4bb4f-5qq52_0e464d5c-de0b-4253-ba73-c336561e972c/kube-auth-proxy/0.log" Apr 21 15:10:51.323389 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:51.323356 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9pxm9_33175606-6499-415b-b273-193922870d52/serve-healthcheck-canary/0.log" Apr 21 15:10:51.923548 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:51.923514 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c67k6_37d8c7f0-2689-4181-9759-32285450db5e/kube-rbac-proxy/0.log" Apr 21 15:10:51.945108 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:51.945071 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c67k6_37d8c7f0-2689-4181-9759-32285450db5e/exporter/0.log" Apr 21 15:10:51.967388 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:51.967350 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c67k6_37d8c7f0-2689-4181-9759-32285450db5e/extractor/0.log" Apr 21 15:10:53.262548 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:53.262519 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qvxwt/perf-node-gather-daemonset-s2kcn" Apr 21 15:10:54.426125 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:54.426086 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7df645bd74-gndhb_4c22033d-0f2b-48bc-b38a-a9e3da2e97ed/manager/0.log" Apr 21 15:10:54.558301 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:54.558264 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-zgxnm_a0a79acd-7330-4755-a213-80b24c9eafb6/postgres/0.log" Apr 
21 15:10:56.138402 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:10:56.138359 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-55dd4758fb-ng8rw_71c52ede-73a2-45ff-829c-5fc6a8f4abaa/manager/0.log" Apr 21 15:11:00.673508 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:00.673478 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zdnvg_9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2/migrator/0.log" Apr 21 15:11:00.696543 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:00.696513 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zdnvg_9e8477fc-9b40-4b7b-8db7-edfd1d2e6ee2/graceful-termination/0.log" Apr 21 15:11:02.134203 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.134174 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/kube-multus-additional-cni-plugins/0.log" Apr 21 15:11:02.172777 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.172718 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/egress-router-binary-copy/0.log" Apr 21 15:11:02.213461 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.213429 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/cni-plugins/0.log" Apr 21 15:11:02.241915 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.241886 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/bond-cni-plugin/0.log" Apr 21 15:11:02.276698 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.276664 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/routeoverride-cni/0.log" Apr 21 15:11:02.312537 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.312506 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/whereabouts-cni-bincopy/0.log" Apr 21 15:11:02.342721 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.342689 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88f5f_71044697-c141-4bb8-a13d-2e24d233501f/whereabouts-cni/0.log" Apr 21 15:11:02.749109 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.749075 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fg4r9_ff78599b-b18f-47bf-83ba-ffa70116ffdd/kube-multus/0.log" Apr 21 15:11:02.856238 ip-10-0-138-93 kubenswrapper[2583]: I0421 15:11:02.856203 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcph6_425eadc2-ce6c-4aeb-9856-41d3b15c076b/network-metrics-daemon/0.log"