Apr 16 22:02:07.598669 ip-10-0-138-154 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:02:07.598681 ip-10-0-138-154 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:02:07.598690 ip-10-0-138-154 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:02:07.598904 ip-10-0-138-154 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:02:17.812917 ip-10-0-138-154 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:02:17.812933 ip-10-0-138-154 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot fadeb04af3f34ddc9418d2ce61526af6 --
Apr 16 22:04:54.295265 ip-10-0-138-154 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:04:54.765587 ip-10-0-138-154 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:54.765587 ip-10-0-138-154 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:04:54.765587 ip-10-0-138-154 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:54.765587 ip-10-0-138-154 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:04:54.765587 ip-10-0-138-154 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:54.766512 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.766423 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:04:54.768678 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768657 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:04:54.768678 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768673 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:04:54.768678 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768678 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:04:54.768678 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768683 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768689 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768695 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768699 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768704 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768707 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768711 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768717 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768721 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768725 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768729 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768732 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768736 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768739 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768743 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768747 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768751 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768754 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768758 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:04:54.768920 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768762 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768766 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768769 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768773 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768777 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768780 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768785 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768789 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768793 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768796 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768800 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768804 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768809 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768813 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768817 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768823 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768827 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768831 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768835 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768839 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:04:54.769722 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768843 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768850 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768856 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768860 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768864 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768868 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768872 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768876 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768880 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768884 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768888 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768892 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768896 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768902 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768906 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768910 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768915 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768919 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768923 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:04:54.770645 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768927 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768932 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768936 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768940 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768944 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768949 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768953 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768957 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768961 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768966 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768970 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768974 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768979 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768982 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768986 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768991 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.768996 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769000 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769004 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769026 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:04:54.771495 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769032 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769036 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769040 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769044 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769048 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769632 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769640 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769645 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769649 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769653 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769657 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769663 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769667 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769672 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769676 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769680 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769684 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769688 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769692 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:04:54.772171 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769697 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769701 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769705 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769709 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769713 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769717 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769721 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769725 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769729 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769734 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769738 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769742 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769746 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769758 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769763 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769768 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769772 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769776 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769780 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769784 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:04:54.772812 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769788 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769793 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769797 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769801 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769805 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769810 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769815 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769819 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769823 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769827 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769831 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769835 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769839 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769843 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769848 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769852 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769856 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769860 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769864 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769869 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:04:54.773331 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769873 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769886 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769891 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769895 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769900 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769904 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769908 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769912 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769917 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769921 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769924 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769929 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769937 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769943 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769948 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769953 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769961 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769968 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769974 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:04:54.773919 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769978 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769983 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769988 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769992 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.769996 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770000 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770004 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770024 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770029 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770032 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770038 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770043 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.770047 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770152 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770163 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770173 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770180 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770188 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770194 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770201 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770208 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:04:54.774408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770213 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770219 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770224 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770230 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770235 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770240 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770245 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770249 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770256 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770261 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770266 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770272 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770277 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770282 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770287 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770292 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770298 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770303 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770308 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770313 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770318 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770323 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770327 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770333 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770337 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:04:54.774955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770344 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770349 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770356 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770361 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770366 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770371 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770378 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770383 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770388 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770393 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770398 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770405 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770409 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770414 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770419 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770426 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770430 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]:
I0416 22:04:54.770435 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770440 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770444 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770449 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770454 2575 flags.go:64] FLAG: --feature-gates="" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770461 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770466 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770471 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:04:54.775578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770476 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770481 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770486 2575 flags.go:64] FLAG: --help="false" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770490 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-138-154.ec2.internal" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770495 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770500 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770505 2575 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770510 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770517 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770523 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770528 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770532 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770537 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770542 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770547 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770552 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770556 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770561 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770566 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770570 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:04:54.776202 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:04:54.770575 2575 flags.go:64] FLAG: --lock-file="" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770579 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770586 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770591 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:04:54.776202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770600 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770605 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770610 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770615 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770621 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770626 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770631 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770635 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770642 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770646 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770653 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 
22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770657 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770661 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770666 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770670 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770675 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770682 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770687 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770698 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770704 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770709 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770714 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770718 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:04:54.776826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770729 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:04:54.770733 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770738 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770743 2575 flags.go:64] FLAG: --port="10250" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770748 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770752 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06ac7796ded54f0a5" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770757 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770762 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770769 2575 flags.go:64] FLAG: --register-node="true" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770774 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770778 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770784 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770789 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770794 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770798 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770804 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770809 2575 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770814 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770819 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770824 2575 flags.go:64] FLAG: --runonce="false" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770829 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770834 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770839 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770843 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770848 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770858 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:04:54.777390 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770863 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770868 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770873 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770878 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770882 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770887 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770892 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770897 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770902 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770910 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770915 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770919 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770925 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770929 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770936 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770941 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770945 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770950 2575 flags.go:64] FLAG: --v="2" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770956 2575 flags.go:64] FLAG: --version="false" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770966 2575 flags.go:64] FLAG: --vmodule="" 
Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770974 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.770980 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771141 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771149 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:54.778397 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771153 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771157 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771162 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771166 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771170 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771175 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771179 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771185 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771189 2575 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771193 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771197 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771202 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771206 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771211 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771216 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771220 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771224 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771228 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771232 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771236 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:54.778961 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771240 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:54.779484 
ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771244 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771250 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771257 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771263 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771268 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771272 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771276 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771280 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771285 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771289 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771294 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771297 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771304 2575 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771310 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771321 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771325 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771330 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771335 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771341 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:54.779484 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771345 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771349 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771353 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771357 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771361 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771366 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 
22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771371 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771375 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771379 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771383 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771387 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771392 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771396 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771400 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771406 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771410 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771414 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771418 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771422 2575 feature_gate.go:328] unrecognized feature 
gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:54.779972 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771426 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771430 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771434 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771439 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771443 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771447 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771451 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771455 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771459 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771463 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771468 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771472 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 
22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771477 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771481 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771485 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771490 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771494 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771497 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771501 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:54.780507 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771506 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771510 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771514 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771517 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771522 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.771525 2575 feature_gate.go:328] unrecognized 
feature gate: PinnedImages Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.771534 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.779316 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.779333 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779379 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779384 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779388 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779391 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779394 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779397 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779400 2575 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 16 22:04:54.780966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779403 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779406 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779409 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779412 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779415 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779417 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779420 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779422 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779425 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779428 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779430 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779433 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 
22:04:54.779435 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779438 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779441 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779444 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779447 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779449 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779453 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:54.781376 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779455 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779458 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779460 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779463 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779466 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779469 2575 feature_gate.go:328] unrecognized 
feature gate: AzureMultiDisk Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779472 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779475 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779478 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779481 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779484 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779487 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779490 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779492 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779495 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779498 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779502 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779507 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779510 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779512 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:54.781840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779515 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779518 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779521 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779523 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779526 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779528 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779531 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779534 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779537 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779540 2575 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDC Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779543 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779545 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779547 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779550 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779553 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779555 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779558 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779561 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779564 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779568 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:04:54.782353 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779572 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779575 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779578 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779580 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779583 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779585 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779588 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779590 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779593 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779596 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779599 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779601 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779604 2575 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779606 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779609 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779612 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779614 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779618 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779620 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:54.782840 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779623 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.779628 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779722 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:54.783323 ip-10-0-138-154 
kubenswrapper[2575]: W0416 22:04:54.779727 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779731 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779736 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779739 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779742 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779745 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779748 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779752 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779754 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779757 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779760 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:54.783323 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779763 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:54.783323 ip-10-0-138-154 
kubenswrapper[2575]: W0416 22:04:54.779766 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779768 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779771 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779774 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779777 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779780 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779783 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779786 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779788 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779791 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779794 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779798 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779802 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779805 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779807 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779810 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779812 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779815 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779817 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:54.783700 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779820 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779822 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779825 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779827 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779829 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779832 2575 feature_gate.go:328] 
unrecognized feature gate: MachineAPIMigration Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779835 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779837 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779840 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779842 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779845 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779847 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779850 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779852 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779855 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779857 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779860 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779862 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: 
W0416 22:04:54.779864 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779867 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:54.784251 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779869 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779872 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779874 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779877 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779880 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779883 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779885 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779888 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779890 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779893 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779895 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 
22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779898 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779900 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779903 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779905 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779908 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779910 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779913 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779915 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779918 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:54.784737 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779920 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779923 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779926 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779928 2575 feature_gate.go:328] unrecognized 
feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779931 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779933 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779936 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779938 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779941 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779943 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779946 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779948 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779951 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:54.779953 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.779958 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true 
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:54.785226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.780796 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 22:04:54.785607 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.784349 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:04:54.785607 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.785450 2575 server.go:1019] "Starting client certificate rotation" Apr 16 22:04:54.785607 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.785540 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:04:54.785607 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.785582 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:04:54.811232 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.811215 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:04:54.817572 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.817555 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:04:54.831323 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.831298 2575 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:04:54.838929 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.838910 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:04:54.839443 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:04:54.839430 2575 log.go:25] "Validated CRI v1 image API" Apr 16 22:04:54.840598 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.840581 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:04:54.842760 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.842742 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b00b8610-53f0-4004-aa35-7c6cd43aa108:/dev/nvme0n1p4 d93c8add-1af5-4c67-8569-addd09419f3a:/dev/nvme0n1p3] Apr 16 22:04:54.842832 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.842760 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:04:54.849122 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.849025 2575 manager.go:217] Machine: {Timestamp:2026-04-16 22:04:54.846928528 +0000 UTC m=+0.426431250 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100478 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b88f64c76fca917815e67b330644b SystemUUID:ec2b88f6-4c76-fca9-1781-5e67b330644b BootID:fadeb04a-f3f3-4ddc-9418-d2ce61526af6 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 
HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8b:23:67:d6:ed Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8b:23:67:d6:ed Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:fd:00:4a:2e:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:04:54.849122 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.849118 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 22:04:54.849235 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.849224 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:04:54.852036 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.852003 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:04:54.852167 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.852039 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-154.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:04:54.852214 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.852177 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:04:54.852214 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.852186 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:04:54.852214 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.852202 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:04:54.853229 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.853219 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:04:54.854689 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.854679 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:04:54.854796 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.854787 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:04:54.857393 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.857384 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:04:54.857425 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.857397 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:04:54.857425 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.857412 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:04:54.857491 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.857429 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:04:54.857491 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.857442 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:04:54.857588 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.857554 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6xnqk"
Apr 16 22:04:54.858598 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.858587 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:04:54.858638 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.858606 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:04:54.861623 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.861610 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 22:04:54.862539 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.862524 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6xnqk"
Apr 16 22:04:54.862923 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.862910 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864769 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864788 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864794 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864799 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864805 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864810 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 22:04:54.864812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864816 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 22:04:54.865075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864824 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 22:04:54.865075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864831 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 22:04:54.865075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864837 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 22:04:54.865075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864857 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 22:04:54.865075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.864869 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 22:04:54.865738 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.865729 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 22:04:54.865781 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.865739 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 22:04:54.869100 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.869087 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 22:04:54.869162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.869119 2575 server.go:1295] "Started kubelet"
Apr 16 22:04:54.869292 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.869242 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 22:04:54.869375 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.869285 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 22:04:54.869375 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.869320 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 22:04:54.869912 ip-10-0-138-154 systemd[1]: Started Kubernetes Kubelet.
Apr 16 22:04:54.871116 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.871096 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 22:04:54.872200 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.872180 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 22:04:54.872498 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.872468 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:54.877137 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.877120 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 22:04:54.877825 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.877805 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-154.ec2.internal" not found
Apr 16 22:04:54.878111 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.878099 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 22:04:54.878901 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.878887 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 22:04:54.878978 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.878962 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 22:04:54.879043 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.878979 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 22:04:54.879093 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.879065 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 22:04:54.879093 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.879076 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 22:04:54.879354 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:54.879335 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 22:04:54.880923 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:54.880898 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-154.ec2.internal\" not found"
Apr 16 22:04:54.881032 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.880990 2575 factory.go:55] Registering systemd factory
Apr 16 22:04:54.881032 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881030 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 22:04:54.881303 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881283 2575 factory.go:153] Registering CRI-O factory
Apr 16 22:04:54.881303 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881303 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 22:04:54.881303 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881304 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:54.881483 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881359 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 22:04:54.881483 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881384 2575 factory.go:103] Registering Raw factory
Apr 16 22:04:54.881483 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881402 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 22:04:54.881679 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881660 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:54.881909 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.881896 2575 manager.go:319] Starting recovery of all containers
Apr 16 22:04:54.883425 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:54.883400 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-154.ec2.internal\" not found" node="ip-10-0-138-154.ec2.internal"
Apr 16 22:04:54.891431 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.891418 2575 manager.go:324] Recovery completed
Apr 16 22:04:54.892676 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.892661 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-154.ec2.internal" not found
Apr 16 22:04:54.895387 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.895374 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:04:54.897354 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.897339 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:04:54.897429 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.897365 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:04:54.897429 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.897375 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:04:54.898251 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.898236 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:04:54.898251 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.898249 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:04:54.898372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.898268 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:04:54.900524 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.900510 2575 policy_none.go:49] "None policy: Start"
Apr 16 22:04:54.900603 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.900529 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:04:54.900603 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.900542 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:04:54.935593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.935579 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:54.935631 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.935663 2575 server.go:85] "Starting device plugin registration server"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.935858 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.935870 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.936034 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.936101 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.936109 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:54.936512 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:04:54.951594 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:54.936544 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-154.ec2.internal\" not found"
Apr 16 22:04:54.953311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:54.953297 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-154.ec2.internal" not found
Apr 16 22:04:55.006145 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.006123 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:04:55.007224 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.007205 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:04:55.007306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.007228 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:04:55.007306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.007243 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:04:55.007306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.007249 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:04:55.007445 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.007312 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:04:55.009953 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.009934 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:55.036845 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.036807 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:04:55.037848 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.037827 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:04:55.037924 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.037859 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:04:55.037924 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.037870 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:04:55.037924 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.037891 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.046619 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.046606 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.108051 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.107977 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"]
Apr 16 22:04:55.110556 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.110540 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.110636 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.110547 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.137533 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.137512 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.141585 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.141571 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.150077 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.150063 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:04:55.152202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.152188 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:04:55.180223 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.180201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/82db8e0a501a3255f6bfb417b6a9cc94-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal\" (UID: \"82db8e0a501a3255f6bfb417b6a9cc94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.180305 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.180228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82db8e0a501a3255f6bfb417b6a9cc94-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal\" (UID: \"82db8e0a501a3255f6bfb417b6a9cc94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.180305 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.180247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be7bc173f8b93cf725d465cb6fdf2be8-config\") pod \"kube-apiserver-proxy-ip-10-0-138-154.ec2.internal\" (UID: \"be7bc173f8b93cf725d465cb6fdf2be8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.281360 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.281338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/82db8e0a501a3255f6bfb417b6a9cc94-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal\" (UID: \"82db8e0a501a3255f6bfb417b6a9cc94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.281452 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.281358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/82db8e0a501a3255f6bfb417b6a9cc94-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal\" (UID: \"82db8e0a501a3255f6bfb417b6a9cc94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.281452 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.281372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82db8e0a501a3255f6bfb417b6a9cc94-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal\" (UID: \"82db8e0a501a3255f6bfb417b6a9cc94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.281452 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.281389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be7bc173f8b93cf725d465cb6fdf2be8-config\") pod \"kube-apiserver-proxy-ip-10-0-138-154.ec2.internal\" (UID: \"be7bc173f8b93cf725d465cb6fdf2be8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.281452 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.281405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82db8e0a501a3255f6bfb417b6a9cc94-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal\" (UID: \"82db8e0a501a3255f6bfb417b6a9cc94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.281452 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.281420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be7bc173f8b93cf725d465cb6fdf2be8-config\") pod \"kube-apiserver-proxy-ip-10-0-138-154.ec2.internal\" (UID: \"be7bc173f8b93cf725d465cb6fdf2be8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.452184 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.452119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.455896 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.455882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal"
Apr 16 22:04:55.785124 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.785045 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:04:55.785777 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.785190 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:55.785777 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.785221 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:55.785777 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.785221 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:55.857810 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.857787 2575 apiserver.go:52] "Watching apiserver"
Apr 16 22:04:55.863360 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.863339 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:04:55.864219 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.864191 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 21:59:54 +0000 UTC" deadline="2027-11-14 19:03:05.558990479 +0000 UTC"
Apr 16 22:04:55.864219 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.864215 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13844h58m9.694777542s"
Apr 16 22:04:55.864625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.864604 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wqzqv","openshift-ovn-kubernetes/ovnkube-node-4tdqb","kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal","openshift-cluster-node-tuning-operator/tuned-65pb8","openshift-dns/node-resolver-w2p9p","openshift-image-registry/node-ca-ffv5n","openshift-multus/multus-additional-cni-plugins-7p5pj","openshift-multus/multus-vj2qt","openshift-network-diagnostics/network-check-target-t9hwb","openshift-network-operator/iptables-alerter-mq6b6","kube-system/konnectivity-agent-7xr5k","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal"]
Apr 16 22:04:55.866161 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.866138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:04:55.866258 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.866211 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:04:55.867318 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.867297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.868663 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.868645 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.869492 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:04:55.869492 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869359 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:04:55.869492 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869442 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:04:55.869492 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869473 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:04:55.869803 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:04:55.870210 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kgszc\""
Apr 16 22:04:55.870210 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.869970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w2p9p"
Apr 16 22:04:55.870210 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.870123 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:04:55.870491 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.870474 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zt5gz\""
Apr 16 22:04:55.870561 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.870536 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:04:55.871508 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.871492 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:04:55.871671 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.871650 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.872324 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.872311 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:04:55.872619 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.872604 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:04:55.872863 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.872848 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.873898 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.873725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d7w2z\""
Apr 16 22:04:55.874034 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.873998 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:04:55.874704 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.874501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:04:55.874704 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.874534 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:04:55.874855 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.874811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bx2nq\""
Apr 16 22:04:55.875475 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.875036 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:04:55.875475 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.875114 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:04:55.875475 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.875261 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:04:55.875475 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.875437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9vld4\""
Apr 16 22:04:55.876499 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.876479 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:04:55.876606 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.876567 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:04:55.876658 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.876440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.876726 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.876707 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:04:55.876979 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.876960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:04:55.877551 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.877376 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:04:55.877838 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.877815 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:55.878490 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.878473 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:04:55.878574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.878564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ldz7x\"" Apr 16 22:04:55.879241 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.879224 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7xr5k" Apr 16 22:04:55.879709 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.879682 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:04:55.879806 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.879789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:04:55.880103 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.880083 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:04:55.880181 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.880130 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ckwwq\"" Apr 16 22:04:55.880631 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.880612 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:55.880980 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.880965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zjszl\"" Apr 16 22:04:55.881152 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.881137 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:04:55.881152 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.881146 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:04:55.882689 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.882452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:04:55.882689 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.882501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:04:55.882689 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.882564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bmw9x\"" Apr 16 22:04:55.882689 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.882585 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:04:55.885508 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4tdqb\" (UID: 
\"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.885642 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.885642 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.885642 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xr7h\" (UniqueName: \"kubernetes.io/projected/368f7f53-a095-41a5-b3f1-ce5057f3c97b-kube-api-access-5xr7h\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.885642 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-systemd\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.885848 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-system-cni-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.885848 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-ovn\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.885848 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27pf\" (UniqueName: \"kubernetes.io/projected/f30d322b-dffa-40ed-b571-c4015d6c53dd-kube-api-access-z27pf\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.885848 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb" Apr 16 22:04:55.885848 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-cni-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.885848 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74bc6\" (UniqueName: \"kubernetes.io/projected/ded33a78-e95e-4a1a-97d0-f06ac24a881a-kube-api-access-74bc6\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysctl-conf\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-run\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcqrq\" (UniqueName: \"kubernetes.io/projected/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-kube-api-access-zcqrq\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-os-release\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: 
\"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.885957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjbt\" (UniqueName: \"kubernetes.io/projected/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-kube-api-access-2fjbt\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-cni-bin\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovnkube-config\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-etc-kubernetes\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bd8aeba-bb32-4282-9d26-de3df1a80218-host-slash\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-kubernetes\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysctl-d\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-cni-bin\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886193 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovn-node-metrics-cert\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-var-lib-kubelet\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886241 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkmt\" (UniqueName: \"kubernetes.io/projected/4ad8b845-f3d7-4afe-a815-787bb7f69564-kube-api-access-mbkmt\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-systemd-units\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-node-log\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovnkube-script-lib\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-sys\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-cnibin\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-cnibin\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-socket-dir-parent\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-kubelet\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-modprobe-d\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ad8b845-f3d7-4afe-a815-787bb7f69564-hosts-file\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886554 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-netns\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.886677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-hostroot\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-conf-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-var-lib-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-log-socket\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:04:55.886708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-cni-netd\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-tuned\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-system-cni-dir\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkdm\" (UniqueName: \"kubernetes.io/projected/0bd8aeba-bb32-4282-9d26-de3df1a80218-kube-api-access-jkkdm\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-os-release\") pod \"multus-vj2qt\" (UID: 
\"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-cni-multus\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-kubelet\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-daemon-config\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-multus-certs\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/7dd7ffd9-34ba-4ace-964e-e902616fd753-konnectivity-ca\") pod \"konnectivity-agent-7xr5k\" (UID: \"7dd7ffd9-34ba-4ace-964e-e902616fd753\") " pod="kube-system/konnectivity-agent-7xr5k" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-slash\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.886993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-run-netns\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.887430 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-lib-modules\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:04:55.887115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-host\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpjgd\" (UniqueName: \"kubernetes.io/projected/ab805ae8-a410-4297-ba4b-b2d47e46aa56-kube-api-access-qpjgd\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ad8b845-f3d7-4afe-a815-787bb7f69564-tmp-dir\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ded33a78-e95e-4a1a-97d0-f06ac24a881a-cni-binary-copy\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-env-overrides\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysconfig\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab805ae8-a410-4297-ba4b-b2d47e46aa56-tmp\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-k8s-cni-cncf-io\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887361 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7dd7ffd9-34ba-4ace-964e-e902616fd753-agent-certs\") pod \"konnectivity-agent-7xr5k\" (UID: \"7dd7ffd9-34ba-4ace-964e-e902616fd753\") " pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-host\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-serviceca\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-systemd\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.888162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-etc-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.888967 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.887578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0bd8aeba-bb32-4282-9d26-de3df1a80218-iptables-alerter-script\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6"
Apr 16 22:04:55.907951 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.907924 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-c2sxq"
Apr 16 22:04:55.915150 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.915132 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-c2sxq"
Apr 16 22:04:55.973976 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:55.973950 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe7bc173f8b93cf725d465cb6fdf2be8.slice/crio-04ecaeb90c9f217e0b837047f01250e3dd9d0ef6110d760a9cf569c897372a3b WatchSource:0}: Error finding container 04ecaeb90c9f217e0b837047f01250e3dd9d0ef6110d760a9cf569c897372a3b: Status 404 returned error can't find the container with id 04ecaeb90c9f217e0b837047f01250e3dd9d0ef6110d760a9cf569c897372a3b
Apr 16 22:04:55.974245 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:55.974228 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82db8e0a501a3255f6bfb417b6a9cc94.slice/crio-0d916087225be851e1c2930b98acb237cf396871b2ee629d532952769b8488ff WatchSource:0}: Error finding container 0d916087225be851e1c2930b98acb237cf396871b2ee629d532952769b8488ff: Status 404 returned error can't find the container with id 0d916087225be851e1c2930b98acb237cf396871b2ee629d532952769b8488ff
Apr 16 22:04:55.978625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.978608 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:04:55.979468 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.979408 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:04:55.987766 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-system-cni-dir\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.987858 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkdm\" (UniqueName: \"kubernetes.io/projected/0bd8aeba-bb32-4282-9d26-de3df1a80218-kube-api-access-jkkdm\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6"
Apr 16 22:04:55.987858 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-os-release\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.987957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-system-cni-dir\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.987957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-os-release\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.987957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-cni-multus\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.987957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-kubelet\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.987957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-cni-multus\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-daemon-config\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-multus-certs\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.987993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-kubelet\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-multus-certs\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7dd7ffd9-34ba-4ace-964e-e902616fd753-konnectivity-ca\") pod \"konnectivity-agent-7xr5k\" (UID: \"7dd7ffd9-34ba-4ace-964e-e902616fd753\") " pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-socket-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98"
Apr 16 22:04:55.988198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-etc-selinux\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-slash\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-run-netns\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988273 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-slash\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-lib-modules\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-run-netns\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-host\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpjgd\" (UniqueName: \"kubernetes.io/projected/ab805ae8-a410-4297-ba4b-b2d47e46aa56-kube-api-access-qpjgd\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ad8b845-f3d7-4afe-a815-787bb7f69564-tmp-dir\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-host\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ded33a78-e95e-4a1a-97d0-f06ac24a881a-cni-binary-copy\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-env-overrides\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysconfig\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-lib-modules\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.988518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab805ae8-a410-4297-ba4b-b2d47e46aa56-tmp\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-daemon-config\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-k8s-cni-cncf-io\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7dd7ffd9-34ba-4ace-964e-e902616fd753-agent-certs\") pod \"konnectivity-agent-7xr5k\" (UID: \"7dd7ffd9-34ba-4ace-964e-e902616fd753\") " pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-sys-fs\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-host\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-serviceca\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-registration-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-systemd\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-etc-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7dd7ffd9-34ba-4ace-964e-e902616fd753-konnectivity-ca\") pod \"konnectivity-agent-7xr5k\" (UID: \"7dd7ffd9-34ba-4ace-964e-e902616fd753\") " pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0bd8aeba-bb32-4282-9d26-de3df1a80218-iptables-alerter-script\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ded33a78-e95e-4a1a-97d0-f06ac24a881a-cni-binary-copy\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988839 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.988882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ad8b845-f3d7-4afe-a815-787bb7f69564-tmp-dir\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p"
Apr 16 22:04:55.989260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-systemd\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-k8s-cni-cncf-io\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989305 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-env-overrides\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-host\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysconfig\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-etc-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989412 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq9bm\" (UniqueName: \"kubernetes.io/projected/02699880-b983-4831-851e-01a87a783b1c-kube-api-access-sq9bm\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xr7h\" (UniqueName: \"kubernetes.io/projected/368f7f53-a095-41a5-b3f1-ce5057f3c97b-kube-api-access-5xr7h\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-systemd\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-serviceca\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-system-cni-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-ovn\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:04:55.990040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z27pf\" (UniqueName: \"kubernetes.io/projected/f30d322b-dffa-40ed-b571-c4015d6c53dd-kube-api-access-z27pf\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-cni-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0bd8aeba-bb32-4282-9d26-de3df1a80218-iptables-alerter-script\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74bc6\" (UniqueName: \"kubernetes.io/projected/ded33a78-e95e-4a1a-97d0-f06ac24a881a-kube-api-access-74bc6\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysctl-conf\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-run\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcqrq\" (UniqueName: \"kubernetes.io/projected/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-kube-api-access-zcqrq\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-os-release\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-system-cni-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt"
Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416
22:04:55.989891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjbt\" (UniqueName: \"kubernetes.io/projected/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-kube-api-access-2fjbt\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-ovn\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-cni-bin\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovnkube-config\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-systemd\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:04:55.989970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:04:55.990867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.989997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-etc-kubernetes\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bd8aeba-bb32-4282-9d26-de3df1a80218-host-slash\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-kubernetes\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysctl-d\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:04:55.990111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-cni-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-cni-bin\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovn-node-metrics-cert\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-run-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-var-lib-kubelet\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:04:55.990214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkmt\" (UniqueName: \"kubernetes.io/projected/4ad8b845-f3d7-4afe-a815-787bb7f69564-kube-api-access-mbkmt\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-var-lib-kubelet\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-systemd-units\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-systemd-units\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.990335 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.990411 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs 
podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:56.490387997 +0000 UTC m=+2.069890703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-etc-kubernetes\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.991721 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bd8aeba-bb32-4282-9d26-de3df1a80218-host-slash\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-os-release\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: 
\"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-kubernetes\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-cni-bin\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-node-log\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovnkube-script-lib\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysctl-conf\") pod \"tuned-65pb8\" (UID: 
\"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovnkube-config\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-sysctl-d\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-node-log\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-sys\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-run\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " 
pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-var-lib-cni-bin\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-sys\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-cnibin\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-cnibin\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " 
pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-socket-dir-parent\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.992651 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-kubelet\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-cnibin\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.990904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f30d322b-dffa-40ed-b571-c4015d6c53dd-cnibin\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " 
pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-modprobe-d\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-socket-dir-parent\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ad8b845-f3d7-4afe-a815-787bb7f69564-hosts-file\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-kubelet\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-netns\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " 
pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-hostroot\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ad8b845-f3d7-4afe-a815-787bb7f69564-hosts-file\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-conf-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-modprobe-d\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-device-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-hostroot\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-var-lib-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-multus-conf-dir\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovnkube-script-lib\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.993574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ded33a78-e95e-4a1a-97d0-f06ac24a881a-host-run-netns\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" 
Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-log-socket\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991223 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-var-lib-openvswitch\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-cni-netd\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-tuned\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-host-cni-netd\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 
22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/368f7f53-a095-41a5-b3f1-ce5057f3c97b-log-socket\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.991709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f30d322b-dffa-40ed-b571-c4015d6c53dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.992547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab805ae8-a410-4297-ba4b-b2d47e46aa56-tmp\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.992889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7dd7ffd9-34ba-4ace-964e-e902616fd753-agent-certs\") pod \"konnectivity-agent-7xr5k\" (UID: \"7dd7ffd9-34ba-4ace-964e-e902616fd753\") " pod="kube-system/konnectivity-agent-7xr5k" Apr 16 22:04:55.994338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.993246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/368f7f53-a095-41a5-b3f1-ce5057f3c97b-ovn-node-metrics-cert\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.994338 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.993888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ab805ae8-a410-4297-ba4b-b2d47e46aa56-etc-tuned\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.995355 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.995336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkdm\" (UniqueName: \"kubernetes.io/projected/0bd8aeba-bb32-4282-9d26-de3df1a80218-kube-api-access-jkkdm\") pod \"iptables-alerter-mq6b6\" (UID: \"0bd8aeba-bb32-4282-9d26-de3df1a80218\") " pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:55.995737 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.995707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpjgd\" (UniqueName: \"kubernetes.io/projected/ab805ae8-a410-4297-ba4b-b2d47e46aa56-kube-api-access-qpjgd\") pod \"tuned-65pb8\" (UID: \"ab805ae8-a410-4297-ba4b-b2d47e46aa56\") " pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:55.996185 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.996167 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7xr5k" Apr 16 22:04:55.997198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.997177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74bc6\" (UniqueName: \"kubernetes.io/projected/ded33a78-e95e-4a1a-97d0-f06ac24a881a-kube-api-access-74bc6\") pod \"multus-vj2qt\" (UID: \"ded33a78-e95e-4a1a-97d0-f06ac24a881a\") " pod="openshift-multus/multus-vj2qt" Apr 16 22:04:55.997369 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.997179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xr7h\" (UniqueName: \"kubernetes.io/projected/368f7f53-a095-41a5-b3f1-ce5057f3c97b-kube-api-access-5xr7h\") pod \"ovnkube-node-4tdqb\" (UID: \"368f7f53-a095-41a5-b3f1-ce5057f3c97b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:55.997702 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:55.997678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27pf\" (UniqueName: \"kubernetes.io/projected/f30d322b-dffa-40ed-b571-c4015d6c53dd-kube-api-access-z27pf\") pod \"multus-additional-cni-plugins-7p5pj\" (UID: \"f30d322b-dffa-40ed-b571-c4015d6c53dd\") " pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:55.999629 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.999607 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:55.999723 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.999633 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:55.999723 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.999646 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod 
openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:55.999723 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:55.999698 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:56.499680257 +0000 UTC m=+2.079182975 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:56.001933 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.001907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkmt\" (UniqueName: \"kubernetes.io/projected/4ad8b845-f3d7-4afe-a815-787bb7f69564-kube-api-access-mbkmt\") pod \"node-resolver-w2p9p\" (UID: \"4ad8b845-f3d7-4afe-a815-787bb7f69564\") " pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:56.002513 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.002491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjbt\" (UniqueName: \"kubernetes.io/projected/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-kube-api-access-2fjbt\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:04:56.002959 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.002933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zcqrq\" (UniqueName: \"kubernetes.io/projected/4a1cd3c5-4d03-444e-82c3-29cdb850d6cf-kube-api-access-zcqrq\") pod \"node-ca-ffv5n\" (UID: \"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf\") " pod="openshift-image-registry/node-ca-ffv5n" Apr 16 22:04:56.003676 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.003654 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dd7ffd9_34ba_4ace_964e_e902616fd753.slice/crio-3f3abe703ba9bed2dd28cd9c8a817f7d374a85b49f2cbd8234adcc66a1a3fb4e WatchSource:0}: Error finding container 3f3abe703ba9bed2dd28cd9c8a817f7d374a85b49f2cbd8234adcc66a1a3fb4e: Status 404 returned error can't find the container with id 3f3abe703ba9bed2dd28cd9c8a817f7d374a85b49f2cbd8234adcc66a1a3fb4e Apr 16 22:04:56.010847 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.010793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal" event={"ID":"82db8e0a501a3255f6bfb417b6a9cc94","Type":"ContainerStarted","Data":"0d916087225be851e1c2930b98acb237cf396871b2ee629d532952769b8488ff"} Apr 16 22:04:56.012415 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.012388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7xr5k" event={"ID":"7dd7ffd9-34ba-4ace-964e-e902616fd753","Type":"ContainerStarted","Data":"3f3abe703ba9bed2dd28cd9c8a817f7d374a85b49f2cbd8234adcc66a1a3fb4e"} Apr 16 22:04:56.013282 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.013262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal" event={"ID":"be7bc173f8b93cf725d465cb6fdf2be8","Type":"ContainerStarted","Data":"04ecaeb90c9f217e0b837047f01250e3dd9d0ef6110d760a9cf569c897372a3b"} Apr 16 22:04:56.091826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091747 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.091826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-device-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.091826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-socket-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-etc-selinux\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-device-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 
22:04:56.092052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-sys-fs\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.091988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-registration-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.092045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sq9bm\" (UniqueName: \"kubernetes.io/projected/02699880-b983-4831-851e-01a87a783b1c-kube-api-access-sq9bm\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092308 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.092091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-registration-dir\") pod 
\"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092308 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.092144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-sys-fs\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092308 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.092153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-socket-dir\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.092308 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.092271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/02699880-b983-4831-851e-01a87a783b1c-etc-selinux\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.098624 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.098608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq9bm\" (UniqueName: \"kubernetes.io/projected/02699880-b983-4831-851e-01a87a783b1c-kube-api-access-sq9bm\") pod \"aws-ebs-csi-driver-node-gmc98\" (UID: \"02699880-b983-4831-851e-01a87a783b1c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.186598 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.186573 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" Apr 16 22:04:56.193123 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.193101 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod368f7f53_a095_41a5_b3f1_ce5057f3c97b.slice/crio-f14f8a111a7c00f80f7f98f28e30bd2ae43a1d4650676df01f5d412071f59878 WatchSource:0}: Error finding container f14f8a111a7c00f80f7f98f28e30bd2ae43a1d4650676df01f5d412071f59878: Status 404 returned error can't find the container with id f14f8a111a7c00f80f7f98f28e30bd2ae43a1d4650676df01f5d412071f59878 Apr 16 22:04:56.204787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.204761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-65pb8" Apr 16 22:04:56.210650 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.210630 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab805ae8_a410_4297_ba4b_b2d47e46aa56.slice/crio-b69388c6ecd847bc479533bd3a379930e7d26db6066893a708460dcfc5253a7e WatchSource:0}: Error finding container b69388c6ecd847bc479533bd3a379930e7d26db6066893a708460dcfc5253a7e: Status 404 returned error can't find the container with id b69388c6ecd847bc479533bd3a379930e7d26db6066893a708460dcfc5253a7e Apr 16 22:04:56.216699 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.216682 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w2p9p" Apr 16 22:04:56.223848 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.223827 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad8b845_f3d7_4afe_a815_787bb7f69564.slice/crio-678c21dd843674ae775ac65dd943f34d0e604c10c50ba55f7cc39a42e3e933a5 WatchSource:0}: Error finding container 678c21dd843674ae775ac65dd943f34d0e604c10c50ba55f7cc39a42e3e933a5: Status 404 returned error can't find the container with id 678c21dd843674ae775ac65dd943f34d0e604c10c50ba55f7cc39a42e3e933a5 Apr 16 22:04:56.232958 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.232942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ffv5n" Apr 16 22:04:56.238928 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.238909 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a1cd3c5_4d03_444e_82c3_29cdb850d6cf.slice/crio-93e1c1344cdf1de2307db373e8b6d7c2a919f5ec3cce983185ab18bc32feb71e WatchSource:0}: Error finding container 93e1c1344cdf1de2307db373e8b6d7c2a919f5ec3cce983185ab18bc32feb71e: Status 404 returned error can't find the container with id 93e1c1344cdf1de2307db373e8b6d7c2a919f5ec3cce983185ab18bc32feb71e Apr 16 22:04:56.251185 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.251165 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" Apr 16 22:04:56.257578 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.257556 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf30d322b_dffa_40ed_b571_c4015d6c53dd.slice/crio-03a61ff1f93942c2c3c21978d8696abe5e3ac9343d9a547b985fc164e1a1dde6 WatchSource:0}: Error finding container 03a61ff1f93942c2c3c21978d8696abe5e3ac9343d9a547b985fc164e1a1dde6: Status 404 returned error can't find the container with id 03a61ff1f93942c2c3c21978d8696abe5e3ac9343d9a547b985fc164e1a1dde6 Apr 16 22:04:56.281076 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.281057 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vj2qt" Apr 16 22:04:56.286740 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.286715 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podded33a78_e95e_4a1a_97d0_f06ac24a881a.slice/crio-a53d6bd6cbb686ea5dd18c8bb3e99c3cc8a4f0ffbfd46741ffe5e568d9b5cfdb WatchSource:0}: Error finding container a53d6bd6cbb686ea5dd18c8bb3e99c3cc8a4f0ffbfd46741ffe5e568d9b5cfdb: Status 404 returned error can't find the container with id a53d6bd6cbb686ea5dd18c8bb3e99c3cc8a4f0ffbfd46741ffe5e568d9b5cfdb Apr 16 22:04:56.287748 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.287689 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mq6b6" Apr 16 22:04:56.293179 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.293159 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd8aeba_bb32_4282_9d26_de3df1a80218.slice/crio-bb58afc6d1c9cfec251d2a3b178121da415d1cf01ee4584dd7f90ff490914004 WatchSource:0}: Error finding container bb58afc6d1c9cfec251d2a3b178121da415d1cf01ee4584dd7f90ff490914004: Status 404 returned error can't find the container with id bb58afc6d1c9cfec251d2a3b178121da415d1cf01ee4584dd7f90ff490914004 Apr 16 22:04:56.302069 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.302049 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" Apr 16 22:04:56.307670 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:04:56.307651 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02699880_b983_4831_851e_01a87a783b1c.slice/crio-d564df19035f5e81dd84b4a74adbb17426bbb0cc1c0bb32da6128407d507be6f WatchSource:0}: Error finding container d564df19035f5e81dd84b4a74adbb17426bbb0cc1c0bb32da6128407d507be6f: Status 404 returned error can't find the container with id d564df19035f5e81dd84b4a74adbb17426bbb0cc1c0bb32da6128407d507be6f Apr 16 22:04:56.449888 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.449814 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-24qxc"] Apr 16 22:04:56.451730 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.451708 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.451847 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.451789 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca" Apr 16 22:04:56.494069 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.494035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-dbus\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.494211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.494101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-kubelet-config\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.494211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.494130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.494211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.494202 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:04:56.494359 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.494343 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:56.494466 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.494399 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:57.4943754 +0000 UTC m=+3.073878124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.594968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb" Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.595064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-dbus\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " 
pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.595121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-kubelet-config\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.595147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.595286 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.595342 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:04:57.095324871 +0000 UTC m=+2.674827581 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.595628 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.595644 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.595656 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:56.595699 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:57.595684419 +0000 UTC m=+3.175187138 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.595768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-dbus\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.596299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.595821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-kubelet-config\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:04:56.884420 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.884345 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:56.917957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.916730 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 21:59:55 +0000 UTC" deadline="2027-11-13 20:30:28.880884946 +0000 UTC" Apr 16 22:04:56.917957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:56.916765 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13822h25m31.964123534s" Apr 16 22:04:57.029917 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.029846 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w2p9p" event={"ID":"4ad8b845-f3d7-4afe-a815-787bb7f69564","Type":"ContainerStarted","Data":"678c21dd843674ae775ac65dd943f34d0e604c10c50ba55f7cc39a42e3e933a5"} Apr 16 22:04:57.037301 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.037272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-65pb8" event={"ID":"ab805ae8-a410-4297-ba4b-b2d47e46aa56","Type":"ContainerStarted","Data":"b69388c6ecd847bc479533bd3a379930e7d26db6066893a708460dcfc5253a7e"} Apr 16 22:04:57.044831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.044802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"f14f8a111a7c00f80f7f98f28e30bd2ae43a1d4650676df01f5d412071f59878"} Apr 16 22:04:57.053106 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.053078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerStarted","Data":"03a61ff1f93942c2c3c21978d8696abe5e3ac9343d9a547b985fc164e1a1dde6"} Apr 16 22:04:57.062996 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.062066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" event={"ID":"02699880-b983-4831-851e-01a87a783b1c","Type":"ContainerStarted","Data":"d564df19035f5e81dd84b4a74adbb17426bbb0cc1c0bb32da6128407d507be6f"} Apr 16 22:04:57.074800 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.074765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mq6b6" event={"ID":"0bd8aeba-bb32-4282-9d26-de3df1a80218","Type":"ContainerStarted","Data":"bb58afc6d1c9cfec251d2a3b178121da415d1cf01ee4584dd7f90ff490914004"} Apr 16 22:04:57.079649 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.079622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vj2qt" event={"ID":"ded33a78-e95e-4a1a-97d0-f06ac24a881a","Type":"ContainerStarted","Data":"a53d6bd6cbb686ea5dd18c8bb3e99c3cc8a4f0ffbfd46741ffe5e568d9b5cfdb"}
Apr 16 22:04:57.084061 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.084035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ffv5n" event={"ID":"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf","Type":"ContainerStarted","Data":"93e1c1344cdf1de2307db373e8b6d7c2a919f5ec3cce983185ab18bc32feb71e"}
Apr 16 22:04:57.098795 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.098770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:04:57.098947 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.098939 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:04:57.099050 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.099022 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:04:58.098978235 +0000 UTC m=+3.678480945 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:04:57.116046 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.115999 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:57.355064 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.354755 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:57.501638 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.501596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:04:57.505992 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.505961 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:04:57.506148 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.506071 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:59.506048765 +0000 UTC m=+5.085551483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:04:57.602571 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.602521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:04:57.602755 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.602700 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:04:57.602755 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.602718 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:04:57.602755 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.602731 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:04:57.602923 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:57.602786 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:59.602768469 +0000 UTC m=+5.182271181 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:04:57.917737 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.917662 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 21:59:55 +0000 UTC" deadline="2027-09-18 03:46:39.709394402 +0000 UTC"
Apr 16 22:04:57.917737 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:57.917705 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12461h41m41.791693189s"
Apr 16 22:04:58.008488 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:58.008456 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:04:58.008671 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:58.008586 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:04:58.009029 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:58.008991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:04:58.009126 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:58.009103 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:04:58.009212 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:58.009198 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:04:58.009309 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:58.009290 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:04:58.107267 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:58.107215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:04:58.107452 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:58.107398 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:04:58.107527 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:58.107456 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:05:00.107437189 +0000 UTC m=+5.686939896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:04:59.519712 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:59.519676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:04:59.520227 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:59.519895 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:04:59.520227 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:59.519982 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:03.519962515 +0000 UTC m=+9.099465223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:04:59.621330 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:04:59.620770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:04:59.621330 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:59.620961 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:04:59.621330 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:59.620980 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:04:59.621330 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:59.620993 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:04:59.621330 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:04:59.621075 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:03.621056162 +0000 UTC m=+9.200558869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:00.008634 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:00.008091 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:00.008634 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:00.008234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:00.008634 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:00.008332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:00.008634 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:00.008437 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:00.008634 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:00.008499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:00.008634 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:00.008563 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:00.124828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:00.124245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:00.124828 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:00.124436 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:00.124828 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:00.124495 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:05:04.124475974 +0000 UTC m=+9.703978686 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:02.007780 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:02.007678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:02.008244 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:02.007795 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:02.008244 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:02.007849 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:02.011173 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:02.007969 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:02.011173 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:02.009054 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:02.011173 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:02.009194 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:03.552096 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:03.552058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:03.552562 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:03.552194 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:03.552562 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:03.552267 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:11.552246441 +0000 UTC m=+17.131749149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:03.652616 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:03.652535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:03.652855 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:03.652675 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:05:03.652855 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:03.652694 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:05:03.652855 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:03.652705 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:03.652855 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:03.652756 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:11.652737331 +0000 UTC m=+17.232240047 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:04.008529 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:04.008449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:04.008705 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:04.008449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:04.008705 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:04.008589 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:04.008705 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:04.008648 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:04.008705 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:04.008469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:04.008892 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:04.008731 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:04.156934 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:04.156782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:04.157155 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:04.157001 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:04.157155 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:04.157078 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:05:12.15705991 +0000 UTC m=+17.736562619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:06.007873 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:06.007838 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:06.007873 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:06.007856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:06.008331 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:06.007964 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:06.008331 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:06.007980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:06.008331 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:06.008063 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:06.008331 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:06.008146 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:08.007643 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:08.007611 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:08.008093 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:08.007649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:08.008093 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:08.007612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:08.008093 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:08.007740 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:08.008093 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:08.007803 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:08.008093 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:08.007861 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:10.008375 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:10.008341 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:10.008818 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:10.008342 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:10.008818 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:10.008470 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:10.008818 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:10.008342 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:10.008818 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:10.008542 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:10.008818 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:10.008634 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:11.614783 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:11.614745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:11.615267 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:11.614912 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:11.615267 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:11.614977 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:27.614961631 +0000 UTC m=+33.194464341 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:11.715816 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:11.715778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:11.715997 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:11.715961 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:05:11.715997 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:11.715983 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:05:11.715997 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:11.715993 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:11.716154 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:11.716056 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:27.716038213 +0000 UTC m=+33.295540921 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:12.008342 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:12.008311 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:12.008605 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:12.008353 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:12.008605 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:12.008387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:12.008605 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:12.008502 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:12.008763 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:12.008613 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:12.008763 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:12.008690 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:12.218824 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:12.218785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:12.219030 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:12.218930 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:12.219030 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:12.218998 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:05:28.218980084 +0000 UTC m=+33.798482796 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:14.008203 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:14.008177 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:14.008584 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:14.008269 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:14.008584 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:14.008278 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:14.008584 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:14.008354 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:14.008584 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:14.008391 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:14.008584 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:14.008450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:15.113092 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.112620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7xr5k" event={"ID":"7dd7ffd9-34ba-4ace-964e-e902616fd753","Type":"ContainerStarted","Data":"ed026a10b4b187f1d4a9b8a691beca8a4164804848d345b15f3e23232f991c6f"}
Apr 16 22:05:15.113901 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.113879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal" event={"ID":"be7bc173f8b93cf725d465cb6fdf2be8","Type":"ContainerStarted","Data":"c12237783a05df12f83a98b58725431cbb069a74d18c21b7c692091ee09b6b93"}
Apr 16 22:05:15.115114 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.115094 2575 generic.go:358] "Generic (PLEG): container finished" podID="82db8e0a501a3255f6bfb417b6a9cc94" containerID="e3c3e90ed2365b013b5c4f07f43a5ec76a29d7b800b049f92fd81ea5bdcbec08" exitCode=0
Apr 16 22:05:15.115202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.115160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal" event={"ID":"82db8e0a501a3255f6bfb417b6a9cc94","Type":"ContainerDied","Data":"e3c3e90ed2365b013b5c4f07f43a5ec76a29d7b800b049f92fd81ea5bdcbec08"}
Apr 16 22:05:15.116356 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.116338 2575 generic.go:358] "Generic (PLEG): container finished" podID="f30d322b-dffa-40ed-b571-c4015d6c53dd" containerID="ec2c53fafbefc326d6d3ea5fe06a7bf582d6f0659d2aa35c8020c4731fce49a2" exitCode=0
Apr 16 22:05:15.116437 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.116397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerDied","Data":"ec2c53fafbefc326d6d3ea5fe06a7bf582d6f0659d2aa35c8020c4731fce49a2"}
Apr 16 22:05:15.117695 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.117678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" event={"ID":"02699880-b983-4831-851e-01a87a783b1c","Type":"ContainerStarted","Data":"e0622c5881dd680dfd3d3574cc522402e48bc4aa9f15c0846bbcebf2e3e77d1c"}
Apr 16 22:05:15.118950 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.118927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vj2qt" event={"ID":"ded33a78-e95e-4a1a-97d0-f06ac24a881a","Type":"ContainerStarted","Data":"ad734f4af14c8848896bc6f333956b6b66efe9db4b5eb3ea6655305cdf4e9811"}
Apr 16 22:05:15.120914 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.120885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ffv5n" event={"ID":"4a1cd3c5-4d03-444e-82c3-29cdb850d6cf","Type":"ContainerStarted","Data":"86a74e9f4394084bd6d1766d2c540fb15f0639efbac5d5083ae8c5b95aee034f"}
Apr 16 22:05:15.122773 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.122752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w2p9p" event={"ID":"4ad8b845-f3d7-4afe-a815-787bb7f69564","Type":"ContainerStarted","Data":"9155a9815a0c033edb2eb0f37052d34d320c2a1010059acc6b839d9b4c6d750f"}
Apr 16 22:05:15.124142 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.124113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-65pb8" event={"ID":"ab805ae8-a410-4297-ba4b-b2d47e46aa56","Type":"ContainerStarted","Data":"25ec8f5481c013ae419cae3fe2ce1acc4ae0bf2af34da0befdeb99efb8726358"}
Apr 16 22:05:15.125788 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.125734 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7xr5k" podStartSLOduration=2.160817106 podStartE2EDuration="20.125724839s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.00527972 +0000 UTC m=+1.584782428" lastFinishedPulling="2026-04-16 22:05:13.970187439 +0000 UTC m=+19.549690161" observedRunningTime="2026-04-16 22:05:15.125422713 +0000 UTC m=+20.704925440" watchObservedRunningTime="2026-04-16 22:05:15.125724839 +0000 UTC m=+20.705227566"
Apr 16 22:05:15.126278 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:05:15.126536 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126518 2575 generic.go:358] "Generic (PLEG): container finished" podID="368f7f53-a095-41a5-b3f1-ce5057f3c97b" containerID="bc1cc464b740869971374010dd65f07de016dbc697e7287485637af9076d47bf" exitCode=1
Apr 16 22:05:15.126593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"fc1743fd555c98c0f8faa32db4bf8c89e1c0e6add769268e4d809d7292a6cf86"}
Apr 16 22:05:15.126593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"1156d05c2f06e73b8a5e4e9dfd772cfd5fd3a44e0bd53e5f9d63d9a330f33ecf"}
Apr 16 22:05:15.126593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"2609424fb35d0a83a670b1802905d1c47d7ba358f88f7318a1887e702c3870f1"}
Apr 16 22:05:15.126593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"4144f68c5fba1cce8b204b5e3152ac67c750dc1c12e71ceeb9fd781450d2ace1"}
Apr 16 22:05:15.126593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerDied","Data":"bc1cc464b740869971374010dd65f07de016dbc697e7287485637af9076d47bf"}
Apr 16 22:05:15.126593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.126594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"4276b7032bbd4db63a86679a03d9e240fc922152d1cedee0d09ec7d9ee01cb04"}
Apr 16 22:05:15.149883 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.149850 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-154.ec2.internal" podStartSLOduration=20.149839456 podStartE2EDuration="20.149839456s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16
22:05:15.137762512 +0000 UTC m=+20.717265240" watchObservedRunningTime="2026-04-16 22:05:15.149839456 +0000 UTC m=+20.729342179"
Apr 16 22:05:15.149961 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.149946 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w2p9p" podStartSLOduration=2.403394346 podStartE2EDuration="20.149942513s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.225152008 +0000 UTC m=+1.804654714" lastFinishedPulling="2026-04-16 22:05:13.971700169 +0000 UTC m=+19.551202881" observedRunningTime="2026-04-16 22:05:15.149605018 +0000 UTC m=+20.729107756" watchObservedRunningTime="2026-04-16 22:05:15.149942513 +0000 UTC m=+20.729445240"
Apr 16 22:05:15.179642 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.179609 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vj2qt" podStartSLOduration=2.481604708 podStartE2EDuration="20.179599289s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.288131439 +0000 UTC m=+1.867634144" lastFinishedPulling="2026-04-16 22:05:13.986126014 +0000 UTC m=+19.565628725" observedRunningTime="2026-04-16 22:05:15.179586482 +0000 UTC m=+20.759089209" watchObservedRunningTime="2026-04-16 22:05:15.179599289 +0000 UTC m=+20.759102015"
Apr 16 22:05:15.180084 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.180062 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ffv5n" podStartSLOduration=2.450208102 podStartE2EDuration="20.180056601s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.240354378 +0000 UTC m=+1.819857083" lastFinishedPulling="2026-04-16 22:05:13.970202862 +0000 UTC m=+19.549705582" observedRunningTime="2026-04-16 22:05:15.161003821 +0000 UTC m=+20.740506548" watchObservedRunningTime="2026-04-16 22:05:15.180056601 +0000 UTC m=+20.759559322"
Apr 16 22:05:15.194068 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.194036 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-65pb8" podStartSLOduration=2.432893058 podStartE2EDuration="20.194028077s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.2119043 +0000 UTC m=+1.791407005" lastFinishedPulling="2026-04-16 22:05:13.973039307 +0000 UTC m=+19.552542024" observedRunningTime="2026-04-16 22:05:15.193788225 +0000 UTC m=+20.773290952" watchObservedRunningTime="2026-04-16 22:05:15.194028077 +0000 UTC m=+20.773530795"
Apr 16 22:05:15.851660 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.851456 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 22:05:15.945918 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.945823 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:05:15.851657841Z","UUID":"2fb0c113-fbf9-4ef8-aa60-c2eb3fbaf589","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:05:15.947540 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.947517 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:05:15.947540 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:15.947543 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:05:16.008366 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:16.008342 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:16.008480 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:16.008373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:16.008480 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:16.008461 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:16.008605 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:16.008499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:16.008651 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:16.008613 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:16.008705 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:16.008688 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:16.130173 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:16.130139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" event={"ID":"02699880-b983-4831-851e-01a87a783b1c","Type":"ContainerStarted","Data":"5e62cd71d79da83e88942d06de9ca7e8fc2ca982e4dcf6bc03a6d5ff66c055e9"}
Apr 16 22:05:16.131770 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:16.131743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mq6b6" event={"ID":"0bd8aeba-bb32-4282-9d26-de3df1a80218","Type":"ContainerStarted","Data":"8daacd7dbde345036b601693db95e9b47f79f57d7aea221ab9aee039f4e5463e"}
Apr 16 22:05:16.144398 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:16.144362 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mq6b6" podStartSLOduration=3.467841591 podStartE2EDuration="21.144350441s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.29422789 +0000 UTC m=+1.873730595" lastFinishedPulling="2026-04-16 22:05:13.970736725 +0000 UTC m=+19.550239445" observedRunningTime="2026-04-16 22:05:16.143891715 +0000 UTC m=+21.723394442" watchObservedRunningTime="2026-04-16 22:05:16.144350441 +0000 UTC m=+21.723853186"
Apr 16 22:05:17.136509 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:17.136478 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:05:17.137230 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:17.136866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"dc0a6ed30ac28711cc781cf6c5f005d22cf49c31ffb0855b3bdf0fb0b46791d2"}
Apr 16 22:05:17.138683 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:17.138654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal" event={"ID":"82db8e0a501a3255f6bfb417b6a9cc94","Type":"ContainerStarted","Data":"eb26e25aa5451d94d4047665213bbd10303c233ae22d494abd34ebf53848ce51"}
Apr 16 22:05:17.140669 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:17.140637 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" event={"ID":"02699880-b983-4831-851e-01a87a783b1c","Type":"ContainerStarted","Data":"a158ad9bd33388c01e9c09fea2aa42a7587995f915d44b7871ee4696f58eb878"}
Apr 16 22:05:17.151489 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:17.151451 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-154.ec2.internal" podStartSLOduration=22.151436028 podStartE2EDuration="22.151436028s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:05:17.151157157 +0000 UTC m=+22.730659884" watchObservedRunningTime="2026-04-16 22:05:17.151436028 +0000 UTC m=+22.730938756"
Apr 16 22:05:17.168441 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:17.166781 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmc98" podStartSLOduration=1.84520213 podStartE2EDuration="22.166765822s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.308933727 +0000 UTC m=+1.888436432" lastFinishedPulling="2026-04-16 22:05:16.6304974 +0000 UTC m=+22.210000124" observedRunningTime="2026-04-16 22:05:17.166307771 +0000 UTC m=+22.745810494" watchObservedRunningTime="2026-04-16 22:05:17.166765822 +0000 UTC m=+22.746268546"
Apr 16 22:05:18.007758 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:18.007728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:18.007925 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:18.007728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:18.007925 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:18.007839 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:18.007925 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:18.007898 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:18.007925 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:18.007728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:18.008121 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:18.007965 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:19.147874 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.147587 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:05:19.148695 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.148179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"e00c059dcb79fa85e6fa1e753201b05ffb2ddef68807e39871ef6fa49d669f24"}
Apr 16 22:05:19.148695 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.148532 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:05:19.148695 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.148574 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:05:19.149353 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.148747 2575 scope.go:117] "RemoveContainer" containerID="bc1cc464b740869971374010dd65f07de016dbc697e7287485637af9076d47bf"
Apr 16 22:05:19.163990 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.163914 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:05:19.558847 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.558794 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:05:19.559309 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:19.559295 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:05:20.008191 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.008165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:20.008310 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.008165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:20.008310 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:20.008266 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:20.008380 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.008165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:20.008380 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:20.008339 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:20.008458 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:20.008396 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:20.153326 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.153299 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:05:20.153775 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.153656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" event={"ID":"368f7f53-a095-41a5-b3f1-ce5057f3c97b","Type":"ContainerStarted","Data":"ea11561a8acd88404f465fd5aaaa4fa46c135c645b43d13de55489543ea07cc9"}
Apr 16 22:05:20.153968 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.153943 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:05:20.155296 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.155274 2575 generic.go:358] "Generic (PLEG): container finished" podID="f30d322b-dffa-40ed-b571-c4015d6c53dd" containerID="021e1e380bb3cc4b0fec81956e0e8c18d85120d457ca208018394735c35acec8" exitCode=0
Apr 16 22:05:20.155392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.155363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerDied","Data":"021e1e380bb3cc4b0fec81956e0e8c18d85120d457ca208018394735c35acec8"}
Apr 16 22:05:20.155596 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.155580 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:05:20.156748 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.156064 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7xr5k"
Apr 16 22:05:20.168291 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.168272 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:05:20.176522 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:20.176490 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb" podStartSLOduration=7.350948627 podStartE2EDuration="25.176479196s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.194571958 +0000 UTC m=+1.774074664" lastFinishedPulling="2026-04-16 22:05:14.020102526 +0000 UTC m=+19.599605233" observedRunningTime="2026-04-16 22:05:20.176452879 +0000 UTC m=+25.755955606" watchObservedRunningTime="2026-04-16 22:05:20.176479196 +0000 UTC m=+25.755981917"
Apr 16 22:05:21.061026 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:21.060807 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-24qxc"]
Apr 16 22:05:21.061152 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:21.061086 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:05:21.061191 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:21.061172 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca"
Apr 16 22:05:21.063913 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:21.063889 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t9hwb"]
Apr 16 22:05:21.064028 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:21.063974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:21.064091 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:21.064070 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953"
Apr 16 22:05:21.064511 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:21.064479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wqzqv"]
Apr 16 22:05:21.064606 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:21.064563 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:21.064697 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:21.064676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:05:22.162463 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:22.162424 2575 generic.go:358] "Generic (PLEG): container finished" podID="f30d322b-dffa-40ed-b571-c4015d6c53dd" containerID="1c935ebc8bb269de9e94999fd7509498a75ad957b6d3c6fa9180d36b58e22598" exitCode=0
Apr 16 22:05:22.163117 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:22.162507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerDied","Data":"1c935ebc8bb269de9e94999fd7509498a75ad957b6d3c6fa9180d36b58e22598"}
Apr 16 22:05:23.007814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:23.007783 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:23.007993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:23.007885 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:23.007993 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:23.007908 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953" Apr 16 22:05:23.008133 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:23.008025 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9" Apr 16 22:05:23.008133 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:23.008076 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:05:23.008227 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:23.008142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca" Apr 16 22:05:24.168198 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:24.168164 2575 generic.go:358] "Generic (PLEG): container finished" podID="f30d322b-dffa-40ed-b571-c4015d6c53dd" containerID="4bea4614ea5f4f6b6411d331c313911b24946fc8b336ecb1cdc7febc0583921f" exitCode=0 Apr 16 22:05:24.168763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:24.168222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerDied","Data":"4bea4614ea5f4f6b6411d331c313911b24946fc8b336ecb1cdc7febc0583921f"} Apr 16 22:05:25.009187 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:25.009154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb" Apr 16 22:05:25.009370 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:25.009263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:05:25.009370 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:25.009292 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:05:25.009370 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:25.009301 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953" Apr 16 22:05:25.009516 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:25.009368 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca" Apr 16 22:05:25.009516 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:25.009460 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9" Apr 16 22:05:27.007631 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.007433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:05:27.008076 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.007442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb" Apr 16 22:05:27.008076 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.007712 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-24qxc" podUID="273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca" Apr 16 22:05:27.008076 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.007816 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9hwb" podUID="2b9c130b-e4f0-45e6-b7d3-748a4e65b953" Apr 16 22:05:27.008076 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.007517 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:05:27.008076 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.007942 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9" Apr 16 22:05:27.284126 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.284098 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-154.ec2.internal" event="NodeReady" Apr 16 22:05:27.284286 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.284221 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 22:05:27.321963 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.321929 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sc92n"] Apr 16 22:05:27.349270 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.349240 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fkq5z"] Apr 16 22:05:27.349398 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.349303 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.351460 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.351438 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 22:05:27.351460 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.351456 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vts6k\"" Apr 16 22:05:27.351614 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.351465 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 22:05:27.364688 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.364668 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sc92n"] Apr 16 22:05:27.364688 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.364690 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fkq5z"] 
Apr 16 22:05:27.364835 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.364787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:27.366997 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.366852 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 22:05:27.366997 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.366883 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 22:05:27.366997 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.366888 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q2drx\"" Apr 16 22:05:27.366997 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.366917 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 22:05:27.434523 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.434494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18fc6a08-d922-4dd4-bac0-76c707d36daa-tmp-dir\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.434633 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.434545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.434633 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.434589 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8ct\" (UniqueName: \"kubernetes.io/projected/18fc6a08-d922-4dd4-bac0-76c707d36daa-kube-api-access-nf8ct\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.434707 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.434652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18fc6a08-d922-4dd4-bac0-76c707d36daa-config-volume\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.535290 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18fc6a08-d922-4dd4-bac0-76c707d36daa-config-volume\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.535290 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18fc6a08-d922-4dd4-bac0-76c707d36daa-tmp-dir\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.535290 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.535512 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535308 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nf8ct\" (UniqueName: \"kubernetes.io/projected/18fc6a08-d922-4dd4-bac0-76c707d36daa-kube-api-access-nf8ct\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.535512 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:27.535512 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzgt\" (UniqueName: \"kubernetes.io/projected/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-kube-api-access-kzzgt\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:27.535512 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.535442 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:27.535689 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.535525 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:05:28.035504307 +0000 UTC m=+33.615007031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found Apr 16 22:05:27.535786 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.535762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18fc6a08-d922-4dd4-bac0-76c707d36daa-config-volume\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.544949 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.544927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8ct\" (UniqueName: \"kubernetes.io/projected/18fc6a08-d922-4dd4-bac0-76c707d36daa-kube-api-access-nf8ct\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.548339 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.548307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18fc6a08-d922-4dd4-bac0-76c707d36daa-tmp-dir\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:27.636104 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.636061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:05:27.636281 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.636137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:27.636281 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.636179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzgt\" (UniqueName: \"kubernetes.io/projected/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-kube-api-access-kzzgt\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:27.636281 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.636218 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:27.636424 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.636282 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:59.636266743 +0000 UTC m=+65.215769448 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:27.636424 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.636317 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:27.636424 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.636396 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:05:28.136375443 +0000 UTC m=+33.715878148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found Apr 16 22:05:27.644001 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.643978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzgt\" (UniqueName: \"kubernetes.io/projected/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-kube-api-access-kzzgt\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:27.736916 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:27.736874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb" Apr 16 
22:05:27.737117 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.737095 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:05:27.737163 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.737125 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:05:27.737163 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.737138 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bw2q8 for pod openshift-network-diagnostics/network-check-target-t9hwb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:05:27.737248 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:27.737201 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8 podName:2b9c130b-e4f0-45e6-b7d3-748a4e65b953 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:59.737185356 +0000 UTC m=+65.316688065 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bw2q8" (UniqueName: "kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8") pod "network-check-target-t9hwb" (UID: "2b9c130b-e4f0-45e6-b7d3-748a4e65b953") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:05:28.039335 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:28.039300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:28.039755 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:28.039467 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:28.039755 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:28.039528 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.039514299 +0000 UTC m=+34.619017009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found Apr 16 22:05:28.140067 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:28.140021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:28.140236 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:28.140132 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:28.140236 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:28.140202 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.140180109 +0000 UTC m=+34.719682823 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found Apr 16 22:05:28.241176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:28.241133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:05:28.241341 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:28.241279 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:05:28.241389 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:28.241346 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret podName:273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca nodeName:}" failed. No retries permitted until 2026-04-16 22:06:00.241330194 +0000 UTC m=+65.820832904 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret") pod "global-pull-secret-syncer-24qxc" (UID: "273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:05:29.007865 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.007780 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:05:29.008075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.007779 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc" Apr 16 22:05:29.008075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.007969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb" Apr 16 22:05:29.010338 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.010316 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:05:29.010973 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.010951 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:05:29.011099 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.010980 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:05:29.011099 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.010980 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:05:29.011099 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.010958 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm5fk\"" Apr 16 22:05:29.011251 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.011044 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrzhk\"" Apr 16 22:05:29.047727 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.047707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:29.048070 
ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:29.047812 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:29.048070 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:29.047868 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:05:31.047854429 +0000 UTC m=+36.627357139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found Apr 16 22:05:29.148543 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:29.148459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:29.148714 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:29.148577 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:29.148714 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:29.148646 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:05:31.14862523 +0000 UTC m=+36.728127969 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found Apr 16 22:05:30.182280 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:30.182249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerStarted","Data":"820b4413bfde67b2b59b3c373921a414ab29c26d373268d936be77ab350cad5e"} Apr 16 22:05:31.060543 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:31.060512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n" Apr 16 22:05:31.060682 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:31.060651 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:31.060731 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:31.060710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:05:35.06069604 +0000 UTC m=+40.640198745 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found Apr 16 22:05:31.161404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:31.161372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:05:31.161538 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:31.161518 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:31.161587 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:31.161576 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:05:35.161561909 +0000 UTC m=+40.741064619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found Apr 16 22:05:31.188208 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:31.188180 2575 generic.go:358] "Generic (PLEG): container finished" podID="f30d322b-dffa-40ed-b571-c4015d6c53dd" containerID="820b4413bfde67b2b59b3c373921a414ab29c26d373268d936be77ab350cad5e" exitCode=0 Apr 16 22:05:31.188533 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:31.188217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerDied","Data":"820b4413bfde67b2b59b3c373921a414ab29c26d373268d936be77ab350cad5e"} Apr 16 22:05:32.192254 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:32.192223 2575 generic.go:358] "Generic (PLEG): container finished" podID="f30d322b-dffa-40ed-b571-c4015d6c53dd" containerID="270ff2a1a42ce4438e27a290e976e5b561d8619be39407e2f16ac237429a338b" exitCode=0 Apr 16 22:05:32.192711 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:32.192268 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerDied","Data":"270ff2a1a42ce4438e27a290e976e5b561d8619be39407e2f16ac237429a338b"} Apr 16 22:05:33.197058 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:33.197004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7p5pj" event={"ID":"f30d322b-dffa-40ed-b571-c4015d6c53dd","Type":"ContainerStarted","Data":"c67df4cb70a2bb6e36f9de427a2956102578c2691a20909d8845fa08dc4cf3ca"} Apr 16 22:05:33.217245 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:33.217199 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-7p5pj" podStartSLOduration=4.451889562 podStartE2EDuration="38.217186945s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:04:56.259091017 +0000 UTC m=+1.838593722" lastFinishedPulling="2026-04-16 22:05:30.024388395 +0000 UTC m=+35.603891105" observedRunningTime="2026-04-16 22:05:33.216125003 +0000 UTC m=+38.795627729" watchObservedRunningTime="2026-04-16 22:05:33.217186945 +0000 UTC m=+38.796689672"
Apr 16 22:05:35.086951 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:35.086915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n"
Apr 16 22:05:35.087376 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:35.087065 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:35.087376 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:35.087124 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:05:43.08710987 +0000 UTC m=+48.666612576 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:35.187744 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:35.187680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z"
Apr 16 22:05:35.187900 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:35.187831 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:35.187900 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:35.187892 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:05:43.187877161 +0000 UTC m=+48.767379867 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found
Apr 16 22:05:43.145223 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:43.145181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n"
Apr 16 22:05:43.145601 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:43.145334 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:43.145601 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:43.145392 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:05:59.145376601 +0000 UTC m=+64.724879305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:43.245625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:43.245585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z"
Apr 16 22:05:43.245773 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:43.245729 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:43.245829 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:43.245786 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:05:59.245771087 +0000 UTC m=+64.825273792 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found
Apr 16 22:05:52.179371 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:52.179344 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4tdqb"
Apr 16 22:05:59.152546 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.152513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n"
Apr 16 22:05:59.152902 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:59.152684 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:59.152902 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:59.152764 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:06:31.152742455 +0000 UTC m=+96.732245182 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:59.253077 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.253038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z"
Apr 16 22:05:59.253241 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:59.253152 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:59.253241 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:59.253222 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:06:31.253205572 +0000 UTC m=+96.832708279 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found
Apr 16 22:05:59.654868 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.654835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:05:59.657064 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.657049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:05:59.665657 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:59.665640 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:05:59.665713 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:05:59.665691 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:03.665676834 +0000 UTC m=+129.245179539 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : secret "metrics-daemon-secret" not found
Apr 16 22:05:59.755694 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.755670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:59.757954 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.757936 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:05:59.768327 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.768309 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:05:59.779986 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.779958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw2q8\" (UniqueName: \"kubernetes.io/projected/2b9c130b-e4f0-45e6-b7d3-748a4e65b953-kube-api-access-bw2q8\") pod \"network-check-target-t9hwb\" (UID: \"2b9c130b-e4f0-45e6-b7d3-748a4e65b953\") " pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:05:59.934131 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.934073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm5fk\""
Apr 16 22:05:59.942672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:05:59.942655 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:06:00.061207 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.061175 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t9hwb"]
Apr 16 22:06:00.065083 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:06:00.065047 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b9c130b_e4f0_45e6_b7d3_748a4e65b953.slice/crio-c8606191d4d5d1e82b6217428709d16bdc8df0467a2e5a4ef739c66b1098f6eb WatchSource:0}: Error finding container c8606191d4d5d1e82b6217428709d16bdc8df0467a2e5a4ef739c66b1098f6eb: Status 404 returned error can't find the container with id c8606191d4d5d1e82b6217428709d16bdc8df0467a2e5a4ef739c66b1098f6eb
Apr 16 22:06:00.243417 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.243390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t9hwb" event={"ID":"2b9c130b-e4f0-45e6-b7d3-748a4e65b953","Type":"ContainerStarted","Data":"c8606191d4d5d1e82b6217428709d16bdc8df0467a2e5a4ef739c66b1098f6eb"}
Apr 16 22:06:00.258778 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.258759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:06:00.261311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.261295 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:06:00.271573 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.271540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca-original-pull-secret\") pod \"global-pull-secret-syncer-24qxc\" (UID: \"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca\") " pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:06:00.527089 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.527001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-24qxc"
Apr 16 22:06:00.669621 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:00.669587 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-24qxc"]
Apr 16 22:06:00.673644 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:06:00.673610 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273e4bd6_9bb1_4c92_b9b0_206b2e8c7fca.slice/crio-24dcf22d59bc617bd23a0861f2ddc10ea41b4faf02a55fa5be9257c6004590e9 WatchSource:0}: Error finding container 24dcf22d59bc617bd23a0861f2ddc10ea41b4faf02a55fa5be9257c6004590e9: Status 404 returned error can't find the container with id 24dcf22d59bc617bd23a0861f2ddc10ea41b4faf02a55fa5be9257c6004590e9
Apr 16 22:06:01.246711 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:01.246673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-24qxc" event={"ID":"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca","Type":"ContainerStarted","Data":"24dcf22d59bc617bd23a0861f2ddc10ea41b4faf02a55fa5be9257c6004590e9"}
Apr 16 22:06:03.251839 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:03.251786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t9hwb" event={"ID":"2b9c130b-e4f0-45e6-b7d3-748a4e65b953","Type":"ContainerStarted","Data":"a3abc7e0e3d95c97dc622af78efa6adf01d7cf9bac5327636d44f1d7840d12ac"}
Apr 16 22:06:03.252375 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:03.252211 2575
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:06:03.269162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:03.269115 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t9hwb" podStartSLOduration=65.408130571 podStartE2EDuration="1m8.269097726s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:06:00.066872262 +0000 UTC m=+65.646374967" lastFinishedPulling="2026-04-16 22:06:02.927839404 +0000 UTC m=+68.507342122" observedRunningTime="2026-04-16 22:06:03.268610095 +0000 UTC m=+68.848112833" watchObservedRunningTime="2026-04-16 22:06:03.269097726 +0000 UTC m=+68.848600455"
Apr 16 22:06:05.257585 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:05.257548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-24qxc" event={"ID":"273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca","Type":"ContainerStarted","Data":"17eb97bf9ca0879a75a246bb7d2f773e5cfa2e1bf1111ef4b8c16e80b866fd74"}
Apr 16 22:06:05.271863 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:05.271821 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-24qxc" podStartSLOduration=65.162420879 podStartE2EDuration="1m9.27180812s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:06:00.675527373 +0000 UTC m=+66.255030079" lastFinishedPulling="2026-04-16 22:06:04.784914612 +0000 UTC m=+70.364417320" observedRunningTime="2026-04-16 22:06:05.271803827 +0000 UTC m=+70.851306566" watchObservedRunningTime="2026-04-16 22:06:05.27180812 +0000 UTC m=+70.851310826"
Apr 16 22:06:31.163657 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:31.163624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n"
Apr 16 22:06:31.164032 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:06:31.163716 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:06:31.164032 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:06:31.163767 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls podName:18fc6a08-d922-4dd4-bac0-76c707d36daa nodeName:}" failed. No retries permitted until 2026-04-16 22:07:35.163753388 +0000 UTC m=+160.743256093 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls") pod "dns-default-sc92n" (UID: "18fc6a08-d922-4dd4-bac0-76c707d36daa") : secret "dns-default-metrics-tls" not found
Apr 16 22:06:31.264631 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:31.264605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z"
Apr 16 22:06:31.264708 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:06:31.264693 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:06:31.264751 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:06:31.264743 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert podName:3cb4878c-59b3-48d2-8c2e-646f1605bf4e nodeName:}" failed. No retries permitted until 2026-04-16 22:07:35.264730953 +0000 UTC m=+160.844233659 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert") pod "ingress-canary-fkq5z" (UID: "3cb4878c-59b3-48d2-8c2e-646f1605bf4e") : secret "canary-serving-cert" not found
Apr 16 22:06:35.259795 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:06:35.259764 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t9hwb"
Apr 16 22:07:01.140625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.140590 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"]
Apr 16 22:07:01.143228 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.143212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.145844 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.145823 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 22:07:01.146824 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.146804 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 22:07:01.146938 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.146827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 22:07:01.147696 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.147677 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-n2fp9\""
Apr 16 22:07:01.147792 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.147733 2575 reflector.go:430]
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 22:07:01.152465 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.152446 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"]
Apr 16 22:07:01.269359 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.269329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.269484 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.269366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2srw\" (UniqueName: \"kubernetes.io/projected/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-kube-api-access-q2srw\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.269484 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.269439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.370119 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.370088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.370269 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.370124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2srw\" (UniqueName: \"kubernetes.io/projected/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-kube-api-access-q2srw\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.370332 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:01.370280 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:01.370380 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:01.370343 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls podName:8aa498c8-4e70-44a5-8cf3-8c5794a14bc9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:01.87032981 +0000 UTC m=+127.449832519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xbj2b" (UID: "8aa498c8-4e70-44a5-8cf3-8c5794a14bc9") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:01.370380 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.370360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.371029 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.370993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.377856 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.377837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2srw\" (UniqueName: \"kubernetes.io/projected/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-kube-api-access-q2srw\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.874102 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:01.874054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:01.874279 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:01.874179 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:01.874279 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:01.874235 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls podName:8aa498c8-4e70-44a5-8cf3-8c5794a14bc9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:02.874220359 +0000 UTC m=+128.453723063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xbj2b" (UID: "8aa498c8-4e70-44a5-8cf3-8c5794a14bc9") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:02.073461 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.073428 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"]
Apr 16 22:07:02.076358 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.076336 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.078518 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.078496 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8kdcv\""
Apr 16 22:07:02.078611 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.078528 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:07:02.079296 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.079278 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 22:07:02.079410 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.079332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 22:07:02.085285 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.085267 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"]
Apr 16 22:07:02.176971 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.176878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.176971 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.176947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mksm\" (UniqueName: \"kubernetes.io/projected/6649c15f-7949-4da9-9ae8-6488f9362044-kube-api-access-4mksm\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.278161 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.278131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.278237 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.278178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mksm\" (UniqueName: \"kubernetes.io/projected/6649c15f-7949-4da9-9ae8-6488f9362044-kube-api-access-4mksm\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.278288 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:02.278269 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:07:02.278349 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:02.278338 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls podName:6649c15f-7949-4da9-9ae8-6488f9362044 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:02.778323505 +0000 UTC m=+128.357826214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4hzb4" (UID: "6649c15f-7949-4da9-9ae8-6488f9362044") : secret "samples-operator-tls" not found
Apr 16 22:07:02.288719 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.288690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mksm\" (UniqueName: \"kubernetes.io/projected/6649c15f-7949-4da9-9ae8-6488f9362044-kube-api-access-4mksm\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.780205 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.780175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:02.780351 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:02.780314 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:07:02.780390 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:02.780378 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls podName:6649c15f-7949-4da9-9ae8-6488f9362044 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:03.780363302 +0000 UTC m=+129.359866007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4hzb4" (UID: "6649c15f-7949-4da9-9ae8-6488f9362044") : secret "samples-operator-tls" not found
Apr 16 22:07:02.880520 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:02.880494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:02.880621 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:02.880601 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:02.880669 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:02.880660 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls podName:8aa498c8-4e70-44a5-8cf3-8c5794a14bc9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:04.880646449 +0000 UTC m=+130.460149154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xbj2b" (UID: "8aa498c8-4e70-44a5-8cf3-8c5794a14bc9") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:03.687272 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:03.687225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv"
Apr 16 22:07:03.687634 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:03.687359 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:07:03.687634 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:03.687415 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs podName:ef0e5fb7-90e1-4234-a572-2eeac57ba8d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:09:05.687400965 +0000 UTC m=+251.266903670 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs") pod "network-metrics-daemon-wqzqv" (UID: "ef0e5fb7-90e1-4234-a572-2eeac57ba8d9") : secret "metrics-daemon-secret" not found Apr 16 22:07:03.787539 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:03.787510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" Apr 16 22:07:03.787685 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:03.787614 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:07:03.787685 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:03.787664 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls podName:6649c15f-7949-4da9-9ae8-6488f9362044 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:05.787651105 +0000 UTC m=+131.367153810 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4hzb4" (UID: "6649c15f-7949-4da9-9ae8-6488f9362044") : secret "samples-operator-tls" not found Apr 16 22:07:04.896649 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:04.896613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b" Apr 16 22:07:04.897001 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:04.896758 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:07:04.897001 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:04.896819 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls podName:8aa498c8-4e70-44a5-8cf3-8c5794a14bc9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:08.896805266 +0000 UTC m=+134.476307972 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xbj2b" (UID: "8aa498c8-4e70-44a5-8cf3-8c5794a14bc9") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:07:05.552533 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.552498 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-758b7d7d5d-mz6gt"] Apr 16 22:07:05.555313 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.555297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.557671 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.557639 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:07:05.557671 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.557654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 22:07:05.557844 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.557680 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6952s\"" Apr 16 22:07:05.557934 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.557920 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:07:05.562300 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.562278 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:07:05.565088 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.565066 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-758b7d7d5d-mz6gt"] Apr 16 22:07:05.703175 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-installation-pull-secrets\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-certificates\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-trusted-ca\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ffd8efd-9ec4-439f-9898-b02d42e45549-ca-trust-extracted\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703299 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g855\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-kube-api-access-8g855\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703447 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703447 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-image-registry-private-configuration\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.703447 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.703409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-bound-sa-token\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.803767 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.803699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.803767 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.803750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.803781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-image-registry-private-configuration\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.803817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-bound-sa-token\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:05.803831 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:05.803846 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-758b7d7d5d-mz6gt: secret "image-registry-tls" not found Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.803856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-installation-pull-secrets\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:05.803901 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls podName:3ffd8efd-9ec4-439f-9898-b02d42e45549 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:06.30388672 +0000 UTC m=+131.883389425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls") pod "image-registry-758b7d7d5d-mz6gt" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549") : secret "image-registry-tls" not found Apr 16 22:07:05.803935 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:05.803903 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:07:05.804299 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:05.803965 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls podName:6649c15f-7949-4da9-9ae8-6488f9362044 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:09.803949698 +0000 UTC m=+135.383452403 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4hzb4" (UID: "6649c15f-7949-4da9-9ae8-6488f9362044") : secret "samples-operator-tls" not found Apr 16 22:07:05.804299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-certificates\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.804299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-trusted-ca\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.804299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ffd8efd-9ec4-439f-9898-b02d42e45549-ca-trust-extracted\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.804299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g855\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-kube-api-access-8g855\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " 
pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.804577 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804559 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ffd8efd-9ec4-439f-9898-b02d42e45549-ca-trust-extracted\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.804734 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-certificates\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.804930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.804915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-trusted-ca\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.806727 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.806706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-installation-pull-secrets\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.806727 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.806715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-image-registry-private-configuration\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.814246 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.814225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-bound-sa-token\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:05.814417 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:05.814399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g855\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-kube-api-access-8g855\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:06.307872 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:06.307837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:06.308321 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:06.307994 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:07:06.308321 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:06.308035 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-758b7d7d5d-mz6gt: secret "image-registry-tls" not 
found Apr 16 22:07:06.308321 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:06.308105 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls podName:3ffd8efd-9ec4-439f-9898-b02d42e45549 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:07.308089977 +0000 UTC m=+132.887592681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls") pod "image-registry-758b7d7d5d-mz6gt" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549") : secret "image-registry-tls" not found Apr 16 22:07:07.314522 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:07.314489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:07.314866 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:07.314608 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:07:07.314866 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:07.314619 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-758b7d7d5d-mz6gt: secret "image-registry-tls" not found Apr 16 22:07:07.314866 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:07.314670 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls podName:3ffd8efd-9ec4-439f-9898-b02d42e45549 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:09.314657414 +0000 UTC m=+134.894160118 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls") pod "image-registry-758b7d7d5d-mz6gt" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549") : secret "image-registry-tls" not found Apr 16 22:07:07.810871 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:07.810840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w2p9p_4ad8b845-f3d7-4afe-a815-787bb7f69564/dns-node-resolver/0.log" Apr 16 22:07:08.610133 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:08.610104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ffv5n_4a1cd3c5-4d03-444e-82c3-29cdb850d6cf/node-ca/0.log" Apr 16 22:07:08.925850 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:08.925750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b" Apr 16 22:07:08.925993 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:08.925894 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:07:08.925993 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:08.925949 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls podName:8aa498c8-4e70-44a5-8cf3-8c5794a14bc9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:16.925935986 +0000 UTC m=+142.505438691 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xbj2b" (UID: "8aa498c8-4e70-44a5-8cf3-8c5794a14bc9") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:07:09.329336 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:09.329300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:07:09.329494 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:09.329438 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:07:09.329494 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:09.329459 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-758b7d7d5d-mz6gt: secret "image-registry-tls" not found Apr 16 22:07:09.329574 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:09.329510 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls podName:3ffd8efd-9ec4-439f-9898-b02d42e45549 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:13.329494678 +0000 UTC m=+138.908997385 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls") pod "image-registry-758b7d7d5d-mz6gt" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549") : secret "image-registry-tls" not found Apr 16 22:07:09.833871 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:09.833838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" Apr 16 22:07:09.834314 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:09.833970 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:07:09.834314 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:09.834061 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls podName:6649c15f-7949-4da9-9ae8-6488f9362044 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:17.83404282 +0000 UTC m=+143.413545525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4hzb4" (UID: "6649c15f-7949-4da9-9ae8-6488f9362044") : secret "samples-operator-tls" not found Apr 16 22:07:10.746120 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.746091 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"] Apr 16 22:07:10.750025 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.749993 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"
Apr 16 22:07:10.752478 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.752460 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-c9zwq\""
Apr 16 22:07:10.756627 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.756475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"]
Apr 16 22:07:10.840519 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.840496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4jf7\" (UniqueName: \"kubernetes.io/projected/d1098ef2-30c0-4710-89a1-5b5ff87e6fa7-kube-api-access-h4jf7\") pod \"network-check-source-8894fc9bd-dbcsp\" (UID: \"d1098ef2-30c0-4710-89a1-5b5ff87e6fa7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"
Apr 16 22:07:10.941175 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.941137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4jf7\" (UniqueName: \"kubernetes.io/projected/d1098ef2-30c0-4710-89a1-5b5ff87e6fa7-kube-api-access-h4jf7\") pod \"network-check-source-8894fc9bd-dbcsp\" (UID: \"d1098ef2-30c0-4710-89a1-5b5ff87e6fa7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"
Apr 16 22:07:10.948839 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:10.948814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4jf7\" (UniqueName: \"kubernetes.io/projected/d1098ef2-30c0-4710-89a1-5b5ff87e6fa7-kube-api-access-h4jf7\") pod \"network-check-source-8894fc9bd-dbcsp\" (UID: \"d1098ef2-30c0-4710-89a1-5b5ff87e6fa7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"
Apr 16 22:07:11.059030 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:11.058955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"
Apr 16 22:07:11.166593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:11.166565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp"]
Apr 16 22:07:11.169918 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:11.169893 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1098ef2_30c0_4710_89a1_5b5ff87e6fa7.slice/crio-74ec7ebc3b71414a95ce0875c163d3245b209ef815461d6f647b8b79d3b146bb WatchSource:0}: Error finding container 74ec7ebc3b71414a95ce0875c163d3245b209ef815461d6f647b8b79d3b146bb: Status 404 returned error can't find the container with id 74ec7ebc3b71414a95ce0875c163d3245b209ef815461d6f647b8b79d3b146bb
Apr 16 22:07:11.380106 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:11.380031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp" event={"ID":"d1098ef2-30c0-4710-89a1-5b5ff87e6fa7","Type":"ContainerStarted","Data":"7273f3e320bcd52d266d7ec35fded5465789920f27639e0e6f1086eb29d39707"}
Apr 16 22:07:11.380106 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:11.380067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp" event={"ID":"d1098ef2-30c0-4710-89a1-5b5ff87e6fa7","Type":"ContainerStarted","Data":"74ec7ebc3b71414a95ce0875c163d3245b209ef815461d6f647b8b79d3b146bb"}
Apr 16 22:07:11.394267 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:11.394228 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dbcsp" podStartSLOduration=1.394215917 podStartE2EDuration="1.394215917s" podCreationTimestamp="2026-04-16 22:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:07:11.392909938 +0000 UTC m=+136.972412666" watchObservedRunningTime="2026-04-16 22:07:11.394215917 +0000 UTC m=+136.973718643"
Apr 16 22:07:12.236775 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.236742 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"]
Apr 16 22:07:12.239981 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.239965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"
Apr 16 22:07:12.242000 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.241975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-w8ntz\""
Apr 16 22:07:12.242100 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.242005 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 22:07:12.242735 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.242713 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 22:07:12.245396 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.245374 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"]
Apr 16 22:07:12.351322 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.351294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvmj\" (UniqueName: \"kubernetes.io/projected/c5ff0589-676f-413b-9ec7-397666bad579-kube-api-access-zwvmj\") pod \"migrator-74bb7799d9-zp9rp\" (UID: \"c5ff0589-676f-413b-9ec7-397666bad579\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"
Apr 16 22:07:12.451599 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.451571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvmj\" (UniqueName: \"kubernetes.io/projected/c5ff0589-676f-413b-9ec7-397666bad579-kube-api-access-zwvmj\") pod \"migrator-74bb7799d9-zp9rp\" (UID: \"c5ff0589-676f-413b-9ec7-397666bad579\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"
Apr 16 22:07:12.458979 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.458956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwvmj\" (UniqueName: \"kubernetes.io/projected/c5ff0589-676f-413b-9ec7-397666bad579-kube-api-access-zwvmj\") pod \"migrator-74bb7799d9-zp9rp\" (UID: \"c5ff0589-676f-413b-9ec7-397666bad579\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"
Apr 16 22:07:12.550770 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.550720 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"
Apr 16 22:07:12.660988 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:12.660961 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp"]
Apr 16 22:07:12.663632 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:12.663597 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5ff0589_676f_413b_9ec7_397666bad579.slice/crio-0a0ea8f5a9020f84ff243b422f47684fd9a6efffe1d816aee133bc77d05c71fb WatchSource:0}: Error finding container 0a0ea8f5a9020f84ff243b422f47684fd9a6efffe1d816aee133bc77d05c71fb: Status 404 returned error can't find the container with id 0a0ea8f5a9020f84ff243b422f47684fd9a6efffe1d816aee133bc77d05c71fb
Apr 16 22:07:13.359178 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:13.359142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:07:13.359606 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:13.359328 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:07:13.359606 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:13.359354 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-758b7d7d5d-mz6gt: secret "image-registry-tls" not found
Apr 16 22:07:13.359606 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:13.359425 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls podName:3ffd8efd-9ec4-439f-9898-b02d42e45549 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:21.359402671 +0000 UTC m=+146.938905379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls") pod "image-registry-758b7d7d5d-mz6gt" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549") : secret "image-registry-tls" not found
Apr 16 22:07:13.384610 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:13.384563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp" event={"ID":"c5ff0589-676f-413b-9ec7-397666bad579","Type":"ContainerStarted","Data":"0a0ea8f5a9020f84ff243b422f47684fd9a6efffe1d816aee133bc77d05c71fb"}
Apr 16 22:07:14.388306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:14.388270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp" event={"ID":"c5ff0589-676f-413b-9ec7-397666bad579","Type":"ContainerStarted","Data":"460751df06b1aa7351eda1bacf7aa9e1d86891bc22f5d01d7335b43139b5e648"}
Apr 16 22:07:14.388667 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:14.388313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp" event={"ID":"c5ff0589-676f-413b-9ec7-397666bad579","Type":"ContainerStarted","Data":"46854f2419bb373f939544df44374cead1c6b5084336e116280bf50ef6155b2c"}
Apr 16 22:07:14.402195 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:14.402146 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zp9rp" podStartSLOduration=1.213604166 podStartE2EDuration="2.402132686s" podCreationTimestamp="2026-04-16 22:07:12 +0000 UTC" firstStartedPulling="2026-04-16 22:07:12.665385389 +0000 UTC m=+138.244888094" lastFinishedPulling="2026-04-16 22:07:13.853913897 +0000 UTC m=+139.433416614" observedRunningTime="2026-04-16 22:07:14.400545221 +0000 UTC m=+139.980047949" watchObservedRunningTime="2026-04-16 22:07:14.402132686 +0000 UTC m=+139.981635439"
Apr 16 22:07:16.986878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:16.986833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:16.987279 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:16.986964 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:16.987279 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:16.987052 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls podName:8aa498c8-4e70-44a5-8cf3-8c5794a14bc9 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:32.987033948 +0000 UTC m=+158.566536655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-xbj2b" (UID: "8aa498c8-4e70-44a5-8cf3-8c5794a14bc9") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:07:17.895944 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:17.895888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:17.898249 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:17.898228 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6649c15f-7949-4da9-9ae8-6488f9362044-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4hzb4\" (UID: \"6649c15f-7949-4da9-9ae8-6488f9362044\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:17.986608 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:17.986566 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"
Apr 16 22:07:18.095301 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:18.095253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4"]
Apr 16 22:07:18.397043 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:18.396991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" event={"ID":"6649c15f-7949-4da9-9ae8-6488f9362044","Type":"ContainerStarted","Data":"f4208fa409c519f081ed5236028053f33c3f116b29c14109a4cc28580131801e"}
Apr 16 22:07:20.402874 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:20.402841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" event={"ID":"6649c15f-7949-4da9-9ae8-6488f9362044","Type":"ContainerStarted","Data":"cd03b30f80f2f58fd01a7f540afc1e264924a18a8158c29508c55b038480ceec"}
Apr 16 22:07:20.403241 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:20.402882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" event={"ID":"6649c15f-7949-4da9-9ae8-6488f9362044","Type":"ContainerStarted","Data":"8e68b63d2cc4040852c5bb12b63769a5c143579d80a5a3ed63ec15fe9f551db9"}
Apr 16 22:07:20.417904 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:20.417864 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4hzb4" podStartSLOduration=17.049728847 podStartE2EDuration="18.417848583s" podCreationTimestamp="2026-04-16 22:07:02 +0000 UTC" firstStartedPulling="2026-04-16 22:07:18.137204483 +0000 UTC m=+143.716707188" lastFinishedPulling="2026-04-16 22:07:19.505324215 +0000 UTC m=+145.084826924" observedRunningTime="2026-04-16 22:07:20.417263999 +0000 UTC m=+145.996766727" watchObservedRunningTime="2026-04-16 22:07:20.417848583 +0000 UTC m=+145.997351313"
Apr 16 22:07:21.418025 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:21.417977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:07:21.420441 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:21.420417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"image-registry-758b7d7d5d-mz6gt\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:07:21.465451 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:21.465426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:07:21.575339 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:21.575310 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-758b7d7d5d-mz6gt"]
Apr 16 22:07:21.578802 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:21.578771 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ffd8efd_9ec4_439f_9898_b02d42e45549.slice/crio-860854993c3db90fdaf2e6f9d815f46f2b7e50c8a2f1a367a3defb823684d83b WatchSource:0}: Error finding container 860854993c3db90fdaf2e6f9d815f46f2b7e50c8a2f1a367a3defb823684d83b: Status 404 returned error can't find the container with id 860854993c3db90fdaf2e6f9d815f46f2b7e50c8a2f1a367a3defb823684d83b
Apr 16 22:07:22.410271 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:22.410238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" event={"ID":"3ffd8efd-9ec4-439f-9898-b02d42e45549","Type":"ContainerStarted","Data":"db4a52d424c5a0be576e13c525972a4cf6e904314f28201e75bc64e201ab0f49"}
Apr 16 22:07:22.410271 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:22.410274 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" event={"ID":"3ffd8efd-9ec4-439f-9898-b02d42e45549","Type":"ContainerStarted","Data":"860854993c3db90fdaf2e6f9d815f46f2b7e50c8a2f1a367a3defb823684d83b"}
Apr 16 22:07:22.410469 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:22.410346 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:07:22.427173 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:22.427130 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" podStartSLOduration=17.427118187 podStartE2EDuration="17.427118187s" podCreationTimestamp="2026-04-16 22:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:07:22.426000068 +0000 UTC m=+148.005502796" watchObservedRunningTime="2026-04-16 22:07:22.427118187 +0000 UTC m=+148.006620913"
Apr 16 22:07:30.359480 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:30.359423 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sc92n" podUID="18fc6a08-d922-4dd4-bac0-76c707d36daa"
Apr 16 22:07:30.374699 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:30.374663 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fkq5z" podUID="3cb4878c-59b3-48d2-8c2e-646f1605bf4e"
Apr 16 22:07:30.429387 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:30.429354 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sc92n"
Apr 16 22:07:32.021071 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:32.021004 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wqzqv" podUID="ef0e5fb7-90e1-4234-a572-2eeac57ba8d9"
Apr 16 22:07:33.001864 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:33.001817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:33.004211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:33.004180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aa498c8-4e70-44a5-8cf3-8c5794a14bc9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-xbj2b\" (UID: \"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:33.251644 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:33.251608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"
Apr 16 22:07:33.363718 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:33.363682 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b"]
Apr 16 22:07:33.367004 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:33.366972 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aa498c8_4e70_44a5_8cf3_8c5794a14bc9.slice/crio-292ce2d9b240ca16cba64143b691699b40faf4da79d04e0de5c4f533e8ec32ab WatchSource:0}: Error finding container 292ce2d9b240ca16cba64143b691699b40faf4da79d04e0de5c4f533e8ec32ab: Status 404 returned error can't find the container with id 292ce2d9b240ca16cba64143b691699b40faf4da79d04e0de5c4f533e8ec32ab
Apr 16 22:07:33.438929 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:33.438893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b" event={"ID":"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9","Type":"ContainerStarted","Data":"292ce2d9b240ca16cba64143b691699b40faf4da79d04e0de5c4f533e8ec32ab"}
Apr 16 22:07:35.220587 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.220503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n"
Apr 16 22:07:35.223596 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.223570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18fc6a08-d922-4dd4-bac0-76c707d36daa-metrics-tls\") pod \"dns-default-sc92n\" (UID: \"18fc6a08-d922-4dd4-bac0-76c707d36daa\") " pod="openshift-dns/dns-default-sc92n"
Apr 16 22:07:35.231638 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.231609 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vts6k\""
Apr 16 22:07:35.239729 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.239708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sc92n"
Apr 16 22:07:35.321452 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.321426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z"
Apr 16 22:07:35.324226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.324203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3cb4878c-59b3-48d2-8c2e-646f1605bf4e-cert\") pod \"ingress-canary-fkq5z\" (UID: \"3cb4878c-59b3-48d2-8c2e-646f1605bf4e\") " pod="openshift-ingress-canary/ingress-canary-fkq5z"
Apr 16 22:07:35.354862 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.354830 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sc92n"]
Apr 16 22:07:35.357966 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:35.357941 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fc6a08_d922_4dd4_bac0_76c707d36daa.slice/crio-332c65015107ee0cf75c2b3c16ffd2722e1c2ead38127f9cf37085287ce48677 WatchSource:0}: Error finding container 332c65015107ee0cf75c2b3c16ffd2722e1c2ead38127f9cf37085287ce48677: Status 404 returned error can't find the container with id 332c65015107ee0cf75c2b3c16ffd2722e1c2ead38127f9cf37085287ce48677
Apr 16 22:07:35.445001 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.444969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b" event={"ID":"8aa498c8-4e70-44a5-8cf3-8c5794a14bc9","Type":"ContainerStarted","Data":"4c92964852fefdf146f98e45e1678981e53ade0851774d6eadc4c7c2cd510ec9"}
Apr 16 22:07:35.446075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.446050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sc92n" event={"ID":"18fc6a08-d922-4dd4-bac0-76c707d36daa","Type":"ContainerStarted","Data":"332c65015107ee0cf75c2b3c16ffd2722e1c2ead38127f9cf37085287ce48677"}
Apr 16 22:07:35.459081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:35.459040 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-xbj2b" podStartSLOduration=32.877204412 podStartE2EDuration="34.45902655s" podCreationTimestamp="2026-04-16 22:07:01 +0000 UTC" firstStartedPulling="2026-04-16 22:07:33.368815087 +0000 UTC m=+158.948317792" lastFinishedPulling="2026-04-16 22:07:34.950637209 +0000 UTC m=+160.530139930" observedRunningTime="2026-04-16 22:07:35.458443895 +0000 UTC m=+161.037946622" watchObservedRunningTime="2026-04-16 22:07:35.45902655 +0000 UTC m=+161.038529274"
Apr 16 22:07:36.324318 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.324291 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tjq9f"]
Apr 16 22:07:36.327747 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.327725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.329969 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.329941 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:07:36.330827 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.330781 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 22:07:36.330827 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.330815 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wk4hv\""
Apr 16 22:07:36.330973 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.330826 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 22:07:36.331166 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.331151 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:07:36.336780 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.336760 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tjq9f"]
Apr 16 22:07:36.352669 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.352648 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-758b7d7d5d-mz6gt"]
Apr 16 22:07:36.357936 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.357907 2575 patch_prober.go:28] interesting pod/image-registry-758b7d7d5d-mz6gt container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:07:36.358073 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.357969 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:07:36.410814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.410782 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"]
Apr 16 22:07:36.413822 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.413795 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"
Apr 16 22:07:36.416613 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.416592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7mrss\""
Apr 16 22:07:36.416762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.416744 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 22:07:36.430853 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.430829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.430955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.430874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.431030 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.430968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-crio-socket\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.431098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.431033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-data-volume\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.431156 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.431100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qbdl\" (UniqueName: \"kubernetes.io/projected/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-kube-api-access-6qbdl\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.436989 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.436952 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"]
Apr 16 22:07:36.532045 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.531989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-data-volume\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.532231 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.532231 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.532231 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-crio-socket\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.532231 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qbdl\" (UniqueName: \"kubernetes.io/projected/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-kube-api-access-6qbdl\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.532231 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/481c641d-358b-4737-befb-5b91970311c7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-4trnb\" (UID: \"481c641d-358b-4737-befb-5b91970311c7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"
Apr 16 22:07:36.532469 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-crio-socket\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.532469 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.532400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-data-volume\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.533392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.533369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.534573 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.534550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.544187 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.544166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qbdl\" (UniqueName: \"kubernetes.io/projected/912f768c-37c1-4dce-b9e2-2d8ce1263ffc-kube-api-access-6qbdl\") pod \"insights-runtime-extractor-tjq9f\" (UID: \"912f768c-37c1-4dce-b9e2-2d8ce1263ffc\") " pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.633257 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.633169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/481c641d-358b-4737-befb-5b91970311c7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-4trnb\" (UID: \"481c641d-358b-4737-befb-5b91970311c7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"
Apr 16 22:07:36.636766 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.636722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/481c641d-358b-4737-befb-5b91970311c7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-4trnb\" (UID: \"481c641d-358b-4737-befb-5b91970311c7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"
Apr 16 22:07:36.638423 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.638400 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tjq9f"
Apr 16 22:07:36.726162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.724376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"
Apr 16 22:07:36.803115 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.803073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tjq9f"]
Apr 16 22:07:36.807195 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:36.807148 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod912f768c_37c1_4dce_b9e2_2d8ce1263ffc.slice/crio-93393f76daceebe1b2d889e168ac33918f03283deab52e5846d8e45a36bbd822 WatchSource:0}: Error finding container 93393f76daceebe1b2d889e168ac33918f03283deab52e5846d8e45a36bbd822: Status 404 returned error can't find the container with id 93393f76daceebe1b2d889e168ac33918f03283deab52e5846d8e45a36bbd822
Apr 16 22:07:36.859053 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:36.858875 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"]
Apr 16 22:07:36.863392 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:36.863360 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod481c641d_358b_4737_befb_5b91970311c7.slice/crio-c801a829414cb63d0c2601cef742f53548de276bb141b477e176cdbc32f480bf WatchSource:0}: Error finding container c801a829414cb63d0c2601cef742f53548de276bb141b477e176cdbc32f480bf: Status 404 returned error can't find the container with id c801a829414cb63d0c2601cef742f53548de276bb141b477e176cdbc32f480bf
Apr 16 22:07:37.452394 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.452343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb"
event={"ID":"481c641d-358b-4737-befb-5b91970311c7","Type":"ContainerStarted","Data":"c801a829414cb63d0c2601cef742f53548de276bb141b477e176cdbc32f480bf"} Apr 16 22:07:37.453765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.453732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjq9f" event={"ID":"912f768c-37c1-4dce-b9e2-2d8ce1263ffc","Type":"ContainerStarted","Data":"64ae18523b738df605f1218fd98a7c9303b54c90cc5c791f2e8d6ee4a707400a"} Apr 16 22:07:37.453889 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.453765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjq9f" event={"ID":"912f768c-37c1-4dce-b9e2-2d8ce1263ffc","Type":"ContainerStarted","Data":"93393f76daceebe1b2d889e168ac33918f03283deab52e5846d8e45a36bbd822"} Apr 16 22:07:37.455533 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.455511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sc92n" event={"ID":"18fc6a08-d922-4dd4-bac0-76c707d36daa","Type":"ContainerStarted","Data":"a10101fc86d86e6addaf25bff03377cabc4d32a07b3095ec5354fe8fbc2df6c1"} Apr 16 22:07:37.455625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.455538 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sc92n" event={"ID":"18fc6a08-d922-4dd4-bac0-76c707d36daa","Type":"ContainerStarted","Data":"39edf40b2b712624febedb16eb3336e1284e309b32c57582e9f478bb651946ec"} Apr 16 22:07:37.455706 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.455687 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sc92n" Apr 16 22:07:37.477536 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:37.477496 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sc92n" podStartSLOduration=129.136598068 podStartE2EDuration="2m10.477483171s" podCreationTimestamp="2026-04-16 
22:05:27 +0000 UTC" firstStartedPulling="2026-04-16 22:07:35.360215134 +0000 UTC m=+160.939717839" lastFinishedPulling="2026-04-16 22:07:36.701100222 +0000 UTC m=+162.280602942" observedRunningTime="2026-04-16 22:07:37.476487205 +0000 UTC m=+163.055989934" watchObservedRunningTime="2026-04-16 22:07:37.477483171 +0000 UTC m=+163.056985897" Apr 16 22:07:38.459557 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:38.459513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjq9f" event={"ID":"912f768c-37c1-4dce-b9e2-2d8ce1263ffc","Type":"ContainerStarted","Data":"052ad83b5425ab777dcd647597e6e0afa5f7de9c69757ae83583b1359bd83be7"} Apr 16 22:07:38.460903 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:38.460874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb" event={"ID":"481c641d-358b-4737-befb-5b91970311c7","Type":"ContainerStarted","Data":"8e12ed1e889cb2a7c44810b2bbec2489dbeee9107a3e8bbb3ba6dee7f1a2c56b"} Apr 16 22:07:38.474210 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:38.474159 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb" podStartSLOduration=1.471535665 podStartE2EDuration="2.474142633s" podCreationTimestamp="2026-04-16 22:07:36 +0000 UTC" firstStartedPulling="2026-04-16 22:07:36.8656538 +0000 UTC m=+162.445156505" lastFinishedPulling="2026-04-16 22:07:37.868260765 +0000 UTC m=+163.447763473" observedRunningTime="2026-04-16 22:07:38.473712899 +0000 UTC m=+164.053215664" watchObservedRunningTime="2026-04-16 22:07:38.474142633 +0000 UTC m=+164.053645361" Apr 16 22:07:39.465781 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:39.465742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjq9f" 
event={"ID":"912f768c-37c1-4dce-b9e2-2d8ce1263ffc","Type":"ContainerStarted","Data":"1882447d11a77c76731c272d8755a91b7c816f5a83cde2a44ddbdef04663b71d"} Apr 16 22:07:39.466228 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:39.465949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb" Apr 16 22:07:39.473851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:39.473826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-4trnb" Apr 16 22:07:39.484116 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:39.484073 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tjq9f" podStartSLOduration=1.494946736 podStartE2EDuration="3.484062224s" podCreationTimestamp="2026-04-16 22:07:36 +0000 UTC" firstStartedPulling="2026-04-16 22:07:36.868479586 +0000 UTC m=+162.447982291" lastFinishedPulling="2026-04-16 22:07:38.857595071 +0000 UTC m=+164.437097779" observedRunningTime="2026-04-16 22:07:39.483326436 +0000 UTC m=+165.062829180" watchObservedRunningTime="2026-04-16 22:07:39.484062224 +0000 UTC m=+165.063564929" Apr 16 22:07:44.008063 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.008024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:07:44.008510 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.008034 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:07:44.010186 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.010170 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q2drx\"" Apr 16 22:07:44.018847 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.018828 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fkq5z" Apr 16 22:07:44.128698 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.128668 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fkq5z"] Apr 16 22:07:44.131813 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:44.131781 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb4878c_59b3_48d2_8c2e_646f1605bf4e.slice/crio-35bda7318c82ae69e3340357792e6c614b640358ee2f097215cd386b39df1238 WatchSource:0}: Error finding container 35bda7318c82ae69e3340357792e6c614b640358ee2f097215cd386b39df1238: Status 404 returned error can't find the container with id 35bda7318c82ae69e3340357792e6c614b640358ee2f097215cd386b39df1238 Apr 16 22:07:44.480302 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.480227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fkq5z" event={"ID":"3cb4878c-59b3-48d2-8c2e-646f1605bf4e","Type":"ContainerStarted","Data":"35bda7318c82ae69e3340357792e6c614b640358ee2f097215cd386b39df1238"} Apr 16 22:07:44.828079 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.828046 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"] Apr 16 22:07:44.831332 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.831312 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.833953 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.833930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 22:07:44.834097 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.833953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:07:44.834097 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.834068 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-7zql9\"" Apr 16 22:07:44.834222 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.834206 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:07:44.841085 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.841038 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"] Apr 16 22:07:44.846990 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.846942 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dmr6l"] Apr 16 22:07:44.850339 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.850317 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.852570 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.852501 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:07:44.852818 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.852791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:07:44.852901 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.852880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:07:44.852982 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.852963 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gz4kv\"" Apr 16 22:07:44.895973 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.895942 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-textfile\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896163 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-sys\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896163 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-tls\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896163 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7c988dc-9643-4f55-9745-2403cd54fc4a-metrics-client-ca\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896163 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/576ba2b0-0acf-4938-bae7-06f509b251ae-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.896163 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6wwm\" (UniqueName: \"kubernetes.io/projected/576ba2b0-0acf-4938-bae7-06f509b251ae-kube-api-access-p6wwm\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.896392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.896392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl76m\" (UniqueName: \"kubernetes.io/projected/d7c988dc-9643-4f55-9745-2403cd54fc4a-kube-api-access-hl76m\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.896392 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-root\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896641 ip-10-0-138-154 kubenswrapper[2575]: 
I0416 22:07:44.896400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.896641 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.896461 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-wtmp\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.997819 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.997780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-textfile\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.997819 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.997818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-sys\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.997894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-sys\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " 
pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.997944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-tls\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.997980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7c988dc-9643-4f55-9745-2403cd54fc4a-metrics-client-ca\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/576ba2b0-0acf-4938-bae7-06f509b251ae-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.998075 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:44.998053 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:07:44.998075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6wwm\" (UniqueName: \"kubernetes.io/projected/576ba2b0-0acf-4938-bae7-06f509b251ae-kube-api-access-p6wwm\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.998372 ip-10-0-138-154 
kubenswrapper[2575]: E0416 22:07:44.998116 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-tls podName:d7c988dc-9643-4f55-9745-2403cd54fc4a nodeName:}" failed. No retries permitted until 2026-04-16 22:07:45.498095971 +0000 UTC m=+171.077598691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-tls") pod "node-exporter-dmr6l" (UID: "d7c988dc-9643-4f55-9745-2403cd54fc4a") : secret "node-exporter-tls" not found Apr 16 22:07:44.998372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.998372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl76m\" (UniqueName: \"kubernetes.io/projected/d7c988dc-9643-4f55-9745-2403cd54fc4a-kube-api-access-hl76m\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 
22:07:44.998372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" Apr 16 22:07:44.998372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-textfile\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998372 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-root\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-wtmp\") pod 
\"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:44.998378 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:44.998632 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-tls podName:576ba2b0-0acf-4938-bae7-06f509b251ae nodeName:}" failed. No retries permitted until 2026-04-16 22:07:45.498594682 +0000 UTC m=+171.078097396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-lpjcl" (UID: "576ba2b0-0acf-4938-bae7-06f509b251ae") : secret "openshift-state-metrics-tls" not found Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7c988dc-9643-4f55-9745-2403cd54fc4a-metrics-client-ca\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-wtmp\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l" Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998721 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d7c988dc-9643-4f55-9745-2403cd54fc4a-root\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:44.998787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.998769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/576ba2b0-0acf-4938-bae7-06f509b251ae-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"
Apr 16 22:07:44.999213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:44.999176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:45.000867 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.000848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:45.000999 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.000978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"
Apr 16 22:07:45.006276 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.006252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6wwm\" (UniqueName: \"kubernetes.io/projected/576ba2b0-0acf-4938-bae7-06f509b251ae-kube-api-access-p6wwm\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"
Apr 16 22:07:45.006381 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.006333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl76m\" (UniqueName: \"kubernetes.io/projected/d7c988dc-9643-4f55-9745-2403cd54fc4a-kube-api-access-hl76m\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:45.503360 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.503308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"
Apr 16 22:07:45.503833 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.503405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-tls\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:45.506313 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.506286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7c988dc-9643-4f55-9745-2403cd54fc4a-node-exporter-tls\") pod \"node-exporter-dmr6l\" (UID: \"d7c988dc-9643-4f55-9745-2403cd54fc4a\") " pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:45.506450 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.506382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/576ba2b0-0acf-4938-bae7-06f509b251ae-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lpjcl\" (UID: \"576ba2b0-0acf-4938-bae7-06f509b251ae\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"
Apr 16 22:07:45.744131 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.744087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"
Apr 16 22:07:45.763064 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.762971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dmr6l"
Apr 16 22:07:45.772867 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:45.772828 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7c988dc_9643_4f55_9745_2403cd54fc4a.slice/crio-c030778d73e8508783e1002ecae375a4426b4ec2c0c48612787e1d5f97f21679 WatchSource:0}: Error finding container c030778d73e8508783e1002ecae375a4426b4ec2c0c48612787e1d5f97f21679: Status 404 returned error can't find the container with id c030778d73e8508783e1002ecae375a4426b4ec2c0c48612787e1d5f97f21679
Apr 16 22:07:45.871477 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.871433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl"]
Apr 16 22:07:45.875245 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:45.875222 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576ba2b0_0acf_4938_bae7_06f509b251ae.slice/crio-6e9aa99f6ff0daf4ed7fb145f958ddc1786be5c6536ebf288b4850da7029269e WatchSource:0}: Error finding container 6e9aa99f6ff0daf4ed7fb145f958ddc1786be5c6536ebf288b4850da7029269e: Status 404 returned error can't find the container with id 6e9aa99f6ff0daf4ed7fb145f958ddc1786be5c6536ebf288b4850da7029269e
Apr 16 22:07:45.917314 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.917290 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:07:45.922633 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.922616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:45.926048 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 22:07:45.926215 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 22:07:45.926215 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 22:07:45.926384 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926090 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 22:07:45.926985 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926827 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 22:07:45.926985 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926827 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-74x5v\""
Apr 16 22:07:45.926985 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926865 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 22:07:45.926985 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.926897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 22:07:45.927238 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.927159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 22:07:45.930274 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.930248 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 22:07:45.933915 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:45.933893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:07:46.006218 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006458 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006458 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgpm\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-kube-api-access-rvgpm\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006555 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006555 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-web-config\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006555 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-volume\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006698 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006698 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006698 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-out\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.006698 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.006689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107675 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgpm\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-kube-api-access-rvgpm\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107675 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107675 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-web-config\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-volume\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-out\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.107955 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.108226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.107998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.108226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.108063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.108226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.108106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.108226 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:46.108202 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 22:07:46.108409 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:07:46.108267 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls podName:5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf nodeName:}" failed. No retries permitted until 2026-04-16 22:07:46.608245944 +0000 UTC m=+172.187748651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf") : secret "alertmanager-main-tls" not found
Apr 16 22:07:46.109157 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.109126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.109400 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.109376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.109528 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.109502 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.109594 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.109562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.109938 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.109864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.110564 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.110539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-out\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.110950 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.110906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.111137 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.111103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.111560 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.111534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-web-config\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.111645 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.111557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-volume\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.111838 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.111817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.112217 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.112201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.112378 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.112361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.117522 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.117503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgpm\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-kube-api-access-rvgpm\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.358309 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.358078 2575 patch_prober.go:28] interesting pod/image-registry-758b7d7d5d-mz6gt container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:07:46.358309 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.358141 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:07:46.488729 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.488686 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fkq5z" event={"ID":"3cb4878c-59b3-48d2-8c2e-646f1605bf4e","Type":"ContainerStarted","Data":"fc9d84a5b9ba2d1de08cdfc5f9dd86564416e8af7afe631585cef90b0209ed7f"}
Apr 16 22:07:46.491247 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.491161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" event={"ID":"576ba2b0-0acf-4938-bae7-06f509b251ae","Type":"ContainerStarted","Data":"8c67d732926a26bca875f03bcd8ac7534118a0050d820a63102379fb2babe13d"}
Apr 16 22:07:46.491247 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.491210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" event={"ID":"576ba2b0-0acf-4938-bae7-06f509b251ae","Type":"ContainerStarted","Data":"f14a284aff8898c462bb0650d7ad4112ca480a611899adaf2223abeb5381f070"}
Apr 16 22:07:46.491247 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.491227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" event={"ID":"576ba2b0-0acf-4938-bae7-06f509b251ae","Type":"ContainerStarted","Data":"6e9aa99f6ff0daf4ed7fb145f958ddc1786be5c6536ebf288b4850da7029269e"}
Apr 16 22:07:46.492482 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.492455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmr6l" event={"ID":"d7c988dc-9643-4f55-9745-2403cd54fc4a","Type":"ContainerStarted","Data":"c030778d73e8508783e1002ecae375a4426b4ec2c0c48612787e1d5f97f21679"}
Apr 16 22:07:46.502309 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.502157 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fkq5z" podStartSLOduration=138.066072756 podStartE2EDuration="2m19.502141721s" podCreationTimestamp="2026-04-16 22:05:27 +0000 UTC" firstStartedPulling="2026-04-16 22:07:44.13354039 +0000 UTC m=+169.713043095" lastFinishedPulling="2026-04-16 22:07:45.569609351 +0000 UTC m=+171.149112060" observedRunningTime="2026-04-16 22:07:46.50180303 +0000 UTC m=+172.081305754" watchObservedRunningTime="2026-04-16 22:07:46.502141721 +0000 UTC m=+172.081644449"
Apr 16 22:07:46.615106 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.614983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.617999 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.617973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:46.833396 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:46.833359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:07:47.003405 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.003339 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:07:47.008306 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:47.008225 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a11ddc3_da0b_4ce4_af87_2cec6aa5dccf.slice/crio-f40b429fb7d04ae798ec8754234b22523104763aa05411436340c8cb6ba0cbd0 WatchSource:0}: Error finding container f40b429fb7d04ae798ec8754234b22523104763aa05411436340c8cb6ba0cbd0: Status 404 returned error can't find the container with id f40b429fb7d04ae798ec8754234b22523104763aa05411436340c8cb6ba0cbd0
Apr 16 22:07:47.463553 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.463521 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sc92n"
Apr 16 22:07:47.497925 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.497889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" event={"ID":"576ba2b0-0acf-4938-bae7-06f509b251ae","Type":"ContainerStarted","Data":"7978ae3d56ff4d558accb1cc53e9c694864368221eb3c97c27bce64d411086e5"}
Apr 16 22:07:47.499631 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.499603 2575 generic.go:358] "Generic (PLEG): container finished" podID="d7c988dc-9643-4f55-9745-2403cd54fc4a" containerID="ccd6e7e44247c560c37c7214c01bbc83e8db899786704fc8e019b22f30626c81" exitCode=0
Apr 16 22:07:47.499762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.499680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmr6l" event={"ID":"d7c988dc-9643-4f55-9745-2403cd54fc4a","Type":"ContainerDied","Data":"ccd6e7e44247c560c37c7214c01bbc83e8db899786704fc8e019b22f30626c81"}
Apr 16 22:07:47.500928 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.500875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"f40b429fb7d04ae798ec8754234b22523104763aa05411436340c8cb6ba0cbd0"}
Apr 16 22:07:47.514251 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.514204 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lpjcl" podStartSLOduration=2.602704284 podStartE2EDuration="3.51418723s" podCreationTimestamp="2026-04-16 22:07:44 +0000 UTC" firstStartedPulling="2026-04-16 22:07:46.002594634 +0000 UTC m=+171.582097346" lastFinishedPulling="2026-04-16 22:07:46.914077571 +0000 UTC m=+172.493580292" observedRunningTime="2026-04-16 22:07:47.513086158 +0000 UTC m=+173.092588886" watchObservedRunningTime="2026-04-16 22:07:47.51418723 +0000 UTC m=+173.093689961"
Apr 16 22:07:47.934795 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.934426 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-54579f7659-wm42w"]
Apr 16 22:07:47.937945 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.937925 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:47.940362 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940126 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 22:07:47.940362 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 22:07:47.940362 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940182 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 22:07:47.940362 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940242 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-dttls\""
Apr 16 22:07:47.940632 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940504 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 22:07:47.940632 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940513 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 22:07:47.940632 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.940573 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2tuanji3mimgs\""
Apr 16 22:07:47.950091 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:47.950072 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-54579f7659-wm42w"]
Apr 16 22:07:48.027442 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-grpc-tls\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027545 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-tls\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027601 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027644 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnx7f\" (UniqueName: \"kubernetes.io/projected/cc1e439f-b750-4821-90f6-eeb916e4509b-kube-api-access-jnx7f\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027720 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc1e439f-b750-4821-90f6-eeb916e4509b-metrics-client-ca\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027801 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.027839 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.027810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.128749 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.128713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc1e439f-b750-4821-90f6-eeb916e4509b-metrics-client-ca\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.128930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.128756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.128930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.128811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.128930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.128834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:48.128930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.128859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-grpc-tls\") pod \"thanos-querier-54579f7659-wm42w\" (UID: 
\"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.129158 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.129027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-tls\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.129158 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.129077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.129158 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.129120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnx7f\" (UniqueName: \"kubernetes.io/projected/cc1e439f-b750-4821-90f6-eeb916e4509b-kube-api-access-jnx7f\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.129593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.129566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc1e439f-b750-4821-90f6-eeb916e4509b-metrics-client-ca\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.131478 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.131456 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.131687 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.131664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-tls\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.131860 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.131843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.131920 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.131899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.131957 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.131924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.131990 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.131964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cc1e439f-b750-4821-90f6-eeb916e4509b-secret-grpc-tls\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.135725 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.135710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnx7f\" (UniqueName: \"kubernetes.io/projected/cc1e439f-b750-4821-90f6-eeb916e4509b-kube-api-access-jnx7f\") pod \"thanos-querier-54579f7659-wm42w\" (UID: \"cc1e439f-b750-4821-90f6-eeb916e4509b\") " pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.259948 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.259921 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" Apr 16 22:07:48.374032 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.373927 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-54579f7659-wm42w"] Apr 16 22:07:48.376374 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:48.376344 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc1e439f_b750_4821_90f6_eeb916e4509b.slice/crio-bd22cb4a6557d7512ddaebcccd78cfccc84e5018f8508f364b4411cc8165f1df WatchSource:0}: Error finding container bd22cb4a6557d7512ddaebcccd78cfccc84e5018f8508f364b4411cc8165f1df: Status 404 returned error can't find the container with id bd22cb4a6557d7512ddaebcccd78cfccc84e5018f8508f364b4411cc8165f1df Apr 16 22:07:48.504411 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.504376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"bd22cb4a6557d7512ddaebcccd78cfccc84e5018f8508f364b4411cc8165f1df"} Apr 16 22:07:48.506183 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.506153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmr6l" event={"ID":"d7c988dc-9643-4f55-9745-2403cd54fc4a","Type":"ContainerStarted","Data":"28f97a730bda2b7afa8f6ab721fa9c7c9254c6a6bdfa073d39e6396eb5b297b8"} Apr 16 22:07:48.506183 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.506183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmr6l" event={"ID":"d7c988dc-9643-4f55-9745-2403cd54fc4a","Type":"ContainerStarted","Data":"fcd34fb40e97d0ed886fe5e4710980362037249948832868054bb08c4be6aca4"} Apr 16 22:07:48.507511 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.507488 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348" exitCode=0 Apr 16 22:07:48.507608 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.507550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348"} Apr 16 22:07:48.525891 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:48.522848 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dmr6l" podStartSLOduration=3.426799842 podStartE2EDuration="4.522832428s" podCreationTimestamp="2026-04-16 22:07:44 +0000 UTC" firstStartedPulling="2026-04-16 22:07:45.776174496 +0000 UTC m=+171.355677206" lastFinishedPulling="2026-04-16 22:07:46.872207071 +0000 UTC m=+172.451709792" observedRunningTime="2026-04-16 22:07:48.522298191 +0000 UTC m=+174.101800918" watchObservedRunningTime="2026-04-16 22:07:48.522832428 +0000 UTC m=+174.102335156" Apr 16 22:07:49.141768 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.141723 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-698654f8b9-vltwc"] Apr 16 22:07:49.145284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.145261 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.147527 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.147497 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-btbwp\"" Apr 16 22:07:49.148131 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.148110 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-15mml7u2813k5\"" Apr 16 22:07:49.148301 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.148283 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 22:07:49.148385 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.148344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:07:49.148500 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.148479 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 22:07:49.148617 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.148508 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 22:07:49.152120 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.151679 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-698654f8b9-vltwc"] Apr 16 22:07:49.238420 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.238386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/09dba9e2-ce5f-46b8-a09f-8fa332e68991-metrics-server-audit-profiles\") pod \"metrics-server-698654f8b9-vltwc\" (UID: 
\"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.238577 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.238451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/09dba9e2-ce5f-46b8-a09f-8fa332e68991-audit-log\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.238577 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.238485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-client-ca-bundle\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.238577 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.238514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-secret-metrics-server-tls\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.238749 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.238606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09dba9e2-ce5f-46b8-a09f-8fa332e68991-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.238749 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:07:49.238640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk5rm\" (UniqueName: \"kubernetes.io/projected/09dba9e2-ce5f-46b8-a09f-8fa332e68991-kube-api-access-gk5rm\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.238835 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.238759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-secret-metrics-server-client-certs\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.339900 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.339866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/09dba9e2-ce5f-46b8-a09f-8fa332e68991-metrics-server-audit-profiles\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.339919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/09dba9e2-ce5f-46b8-a09f-8fa332e68991-audit-log\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.339946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-client-ca-bundle\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.339976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-secret-metrics-server-tls\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.339996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09dba9e2-ce5f-46b8-a09f-8fa332e68991-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.340037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk5rm\" (UniqueName: \"kubernetes.io/projected/09dba9e2-ce5f-46b8-a09f-8fa332e68991-kube-api-access-gk5rm\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340378 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.340102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-secret-metrics-server-client-certs\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " 
pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340514 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.340434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/09dba9e2-ce5f-46b8-a09f-8fa332e68991-audit-log\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.340981 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.340961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/09dba9e2-ce5f-46b8-a09f-8fa332e68991-metrics-server-audit-profiles\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.341102 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.341066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09dba9e2-ce5f-46b8-a09f-8fa332e68991-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.343026 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.342993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-client-ca-bundle\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.343180 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.343131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-secret-metrics-server-tls\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.343180 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.343174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/09dba9e2-ce5f-46b8-a09f-8fa332e68991-secret-metrics-server-client-certs\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.347824 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.347804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk5rm\" (UniqueName: \"kubernetes.io/projected/09dba9e2-ce5f-46b8-a09f-8fa332e68991-kube-api-access-gk5rm\") pod \"metrics-server-698654f8b9-vltwc\" (UID: \"09dba9e2-ce5f-46b8-a09f-8fa332e68991\") " pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.459966 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.459880 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:07:49.518073 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.517786 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-9vf4q"] Apr 16 22:07:49.521804 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.521651 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:07:49.524655 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.524479 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:07:49.524655 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.524640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:07:49.524821 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.524718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-g74hc\"" Apr 16 22:07:49.530783 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.530762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-9vf4q"] Apr 16 22:07:49.600749 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.600718 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-698654f8b9-vltwc"] Apr 16 22:07:49.642434 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.642365 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpvx\" (UniqueName: \"kubernetes.io/projected/3672d731-f084-4334-a03d-3a333467d313-kube-api-access-2mpvx\") pod \"downloads-6bcc868b7-9vf4q\" (UID: \"3672d731-f084-4334-a03d-3a333467d313\") " pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:07:49.743913 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.743851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpvx\" (UniqueName: \"kubernetes.io/projected/3672d731-f084-4334-a03d-3a333467d313-kube-api-access-2mpvx\") pod \"downloads-6bcc868b7-9vf4q\" (UID: \"3672d731-f084-4334-a03d-3a333467d313\") " pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:07:49.752151 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:07:49.752112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpvx\" (UniqueName: \"kubernetes.io/projected/3672d731-f084-4334-a03d-3a333467d313-kube-api-access-2mpvx\") pod \"downloads-6bcc868b7-9vf4q\" (UID: \"3672d731-f084-4334-a03d-3a333467d313\") " pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:07:49.839440 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:49.839405 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:07:49.854287 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:49.854255 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09dba9e2_ce5f_46b8_a09f_8fa332e68991.slice/crio-c9693c21b16a2cbbfaf64356dd511c5c63657479457e0827c49e4ce129a6bbb6 WatchSource:0}: Error finding container c9693c21b16a2cbbfaf64356dd511c5c63657479457e0827c49e4ce129a6bbb6: Status 404 returned error can't find the container with id c9693c21b16a2cbbfaf64356dd511c5c63657479457e0827c49e4ce129a6bbb6 Apr 16 22:07:50.520002 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:50.519970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"6a585fe4a7b392a16d8f4fb62c2874b7c1c113342d9e33b4b13e14921c99ad15"} Apr 16 22:07:50.522687 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:50.522651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a"} Apr 16 22:07:50.525531 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:50.525447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" event={"ID":"09dba9e2-ce5f-46b8-a09f-8fa332e68991","Type":"ContainerStarted","Data":"c9693c21b16a2cbbfaf64356dd511c5c63657479457e0827c49e4ce129a6bbb6"} Apr 16 22:07:50.546860 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:50.546830 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-9vf4q"] Apr 16 22:07:50.551498 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:50.551472 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3672d731_f084_4334_a03d_3a333467d313.slice/crio-6ec0c1e5208a4b671f1586c80d8c5f4e7a891fdf340851b3d22be433ddaaeb5f WatchSource:0}: Error finding container 6ec0c1e5208a4b671f1586c80d8c5f4e7a891fdf340851b3d22be433ddaaeb5f: Status 404 returned error can't find the container with id 6ec0c1e5208a4b671f1586c80d8c5f4e7a891fdf340851b3d22be433ddaaeb5f Apr 16 22:07:51.019112 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.019084 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:07:51.022976 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.022958 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.025265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025218 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 22:07:51.025265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025231 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 22:07:51.025265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025260 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 22:07:51.025496 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025292 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-458er8nm4et8a\""
Apr 16 22:07:51.025964 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025744 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 22:07:51.025964 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025815 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 22:07:51.025964 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025836 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 22:07:51.025964 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025848 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 22:07:51.025964 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025874 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 22:07:51.025964 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025950 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 22:07:51.026354 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.025986 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 22:07:51.026354 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.026049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-bxj9z\""
Apr 16 22:07:51.026354 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.026281 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 22:07:51.027593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.027573 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 22:07:51.035609 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.035581 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 22:07:51.054387 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054523 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054523 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054628 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054628 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054628 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054628 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47t6q\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-kube-api-access-47t6q\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054829 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054829 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-web-config\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054829 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054829 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-config-out\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.054829 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.055090 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.055090 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.055090 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-config\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.055090 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.055090 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.054996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.055090 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.055068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156571 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156571 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156571 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156571 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156571 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47t6q\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-kube-api-access-47t6q\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-web-config\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-config-out\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.156830 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.157294 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.157294 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-config\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.157294 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.157294 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.156956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.162234 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.162202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.162465 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.162424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.162768 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.162729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.163359 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.162948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.166691 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.165711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.166691 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.166370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.169394 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.169004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.170148 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.169690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-web-config\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.171871 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.171799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-config-out\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.172529 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.172455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47t6q\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-kube-api-access-47t6q\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.176197 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.175501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.176197 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.175774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.176346 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.176248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.177764 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.177738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.177859 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.177775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.177859 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.177850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-config\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.179318 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.179232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.179411 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.179398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.335923 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.335829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:07:51.530216 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.530176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-9vf4q" event={"ID":"3672d731-f084-4334-a03d-3a333467d313","Type":"ContainerStarted","Data":"6ec0c1e5208a4b671f1586c80d8c5f4e7a891fdf340851b3d22be433ddaaeb5f"}
Apr 16 22:07:51.532812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.532782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"fd20679c7a381ee9c32eed1ff101a31bae6a69400f5b92a007a3be1e4e279003"}
Apr 16 22:07:51.532927 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.532818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"2a0f2efdb7be70bb30be76841949e8492d06e4d848c9fa8cd9158326c0d73fbd"}
Apr 16 22:07:51.535873 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.535842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8"}
Apr 16 22:07:51.535977 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.535876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24"}
Apr 16 22:07:51.535977 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.535891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d"}
Apr 16 22:07:51.535977 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.535902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54"}
Apr 16 22:07:51.859484 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:51.859440 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 22:07:51.867725 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:07:51.867695 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92a264a8_cb5e_4129_952c_8aa774856233.slice/crio-e831a90a451c98a8ee86836641fabb1070dd81e375b8590b8f1e0823514fda21 WatchSource:0}: Error finding container e831a90a451c98a8ee86836641fabb1070dd81e375b8590b8f1e0823514fda21: Status 404 returned error can't find the container with id e831a90a451c98a8ee86836641fabb1070dd81e375b8590b8f1e0823514fda21
Apr 16 22:07:52.542607 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.542563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerStarted","Data":"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02"}
Apr 16 22:07:52.544181 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.544146 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" exitCode=0
Apr 16 22:07:52.544299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.544224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"}
Apr 16 22:07:52.544299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.544256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"e831a90a451c98a8ee86836641fabb1070dd81e375b8590b8f1e0823514fda21"}
Apr 16 22:07:52.545900 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.545786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" event={"ID":"09dba9e2-ce5f-46b8-a09f-8fa332e68991","Type":"ContainerStarted","Data":"e77d0555d5c33732b7368ecd8582ee92312700272f13bbe896fc3aca1ecc160b"}
Apr 16 22:07:52.548925 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.548902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"a57a629358a5120e0829d678d895f538f95c9cd0875cc1f6e224700c6465467f"}
Apr 16 22:07:52.549104 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.548932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"e83e2dda01005dce2863953aa87ad1eb6a6553e8f56dc547e56d06bcb6f0dbfb"}
Apr 16 22:07:52.549104 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.548944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" event={"ID":"cc1e439f-b750-4821-90f6-eeb916e4509b","Type":"ContainerStarted","Data":"51ad0552545dc387878dc863782f93788d490aee93da3d91ee8b51ddbd057ae7"}
Apr 16 22:07:52.549213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.549113 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:07:52.568191 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.567531 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.835363679 podStartE2EDuration="7.567517355s" podCreationTimestamp="2026-04-16 22:07:45 +0000 UTC" firstStartedPulling="2026-04-16 22:07:47.01092069 +0000 UTC m=+172.590423402" lastFinishedPulling="2026-04-16 22:07:51.743074357 +0000 UTC m=+177.322577078" observedRunningTime="2026-04-16 22:07:52.56638327 +0000 UTC m=+178.145886036" watchObservedRunningTime="2026-04-16 22:07:52.567517355 +0000 UTC m=+178.147020084"
Apr 16 22:07:52.613021 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.612949 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" podStartSLOduration=1.7607312259999999 podStartE2EDuration="3.612930668s" podCreationTimestamp="2026-04-16 22:07:49 +0000 UTC" firstStartedPulling="2026-04-16 22:07:49.856095642 +0000 UTC m=+175.435598348" lastFinishedPulling="2026-04-16 22:07:51.708295067 +0000 UTC m=+177.287797790" observedRunningTime="2026-04-16 22:07:52.588986368 +0000 UTC m=+178.168489100" watchObservedRunningTime="2026-04-16 22:07:52.612930668 +0000 UTC m=+178.192433397"
Apr 16 22:07:52.629479 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:52.629429 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w" podStartSLOduration=2.298319871 podStartE2EDuration="5.629411035s" podCreationTimestamp="2026-04-16 22:07:47 +0000 UTC" firstStartedPulling="2026-04-16 22:07:48.378322015 +0000 UTC m=+173.957824724" lastFinishedPulling="2026-04-16 22:07:51.709413175 +0000 UTC m=+177.288915888" observedRunningTime="2026-04-16 22:07:52.628807363 +0000 UTC m=+178.208310090" watchObservedRunningTime="2026-04-16 22:07:52.629411035 +0000 UTC m=+178.208913765"
Apr 16 22:07:55.566416 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:55.564366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"}
Apr 16 22:07:55.566416 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:55.564405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"}
Apr 16 22:07:56.358478 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:56.358449 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:07:56.571087 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:56.571049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"}
Apr 16 22:07:56.571087 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:56.571091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"}
Apr 16 22:07:56.571087 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:56.571105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"}
Apr 16 22:07:56.571613 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:56.571119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerStarted","Data":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"}
Apr 16 22:07:56.595666 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:56.595413 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.7381794 podStartE2EDuration="6.595396006s" podCreationTimestamp="2026-04-16 22:07:50 +0000 UTC" firstStartedPulling="2026-04-16 22:07:52.545508361 +0000 UTC m=+178.125011081" lastFinishedPulling="2026-04-16 22:07:55.402724976 +0000 UTC m=+180.982227687" observedRunningTime="2026-04-16 22:07:56.593913785 +0000 UTC m=+182.173416512" watchObservedRunningTime="2026-04-16 22:07:56.595396006 +0000 UTC m=+182.174898734"
Apr 16 22:07:58.560291 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:07:58.560265 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-54579f7659-wm42w"
Apr 16 22:08:01.336056 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:01.336018 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:08:01.373243 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:01.373205 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" containerID="cri-o://db4a52d424c5a0be576e13c525972a4cf6e904314f28201e75bc64e201ab0f49" gracePeriod=30
Apr 16 22:08:06.354202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.354163 2575 patch_prober.go:28] interesting pod/image-registry-758b7d7d5d-mz6gt container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.133.0.10:5000/healthz\": dial tcp 10.133.0.10:5000: connect: connection refused" start-of-body=
Apr 16 22:08:06.354675 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.354232 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" probeResult="failure" output="Get \"https://10.133.0.10:5000/healthz\": dial tcp 10.133.0.10:5000: connect: connection refused"
Apr 16 22:08:06.603578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.603549 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerID="db4a52d424c5a0be576e13c525972a4cf6e904314f28201e75bc64e201ab0f49" exitCode=0
Apr 16 22:08:06.603704 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.603633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" event={"ID":"3ffd8efd-9ec4-439f-9898-b02d42e45549","Type":"ContainerDied","Data":"db4a52d424c5a0be576e13c525972a4cf6e904314f28201e75bc64e201ab0f49"}
Apr 16 22:08:06.647929 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.647908 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt"
Apr 16 22:08:06.812162 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812127 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-certificates\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") "
Apr 16 22:08:06.812311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812170 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-bound-sa-token\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") "
Apr 16 22:08:06.812311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812214 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-trusted-ca\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") "
Apr 16 22:08:06.812311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812241 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g855\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-kube-api-access-8g855\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") "
Apr 16 22:08:06.812311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812259 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ffd8efd-9ec4-439f-9898-b02d42e45549-ca-trust-extracted\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") "
Apr 16
22:08:06.812500 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812314 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " Apr 16 22:08:06.812500 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812356 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-image-registry-private-configuration\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " Apr 16 22:08:06.812500 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812415 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-installation-pull-secrets\") pod \"3ffd8efd-9ec4-439f-9898-b02d42e45549\" (UID: \"3ffd8efd-9ec4-439f-9898-b02d42e45549\") " Apr 16 22:08:06.812648 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812527 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:08:06.812744 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812722 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-certificates\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:06.812943 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.812919 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:08:06.815123 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.815096 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:08:06.815238 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.815211 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-kube-api-access-8g855" (OuterVolumeSpecName: "kube-api-access-8g855") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "kube-api-access-8g855". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:08:06.815295 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.815232 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:08:06.815335 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.815304 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:08:06.815536 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.815521 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:08:06.824172 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.824143 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ffd8efd-9ec4-439f-9898-b02d42e45549-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3ffd8efd-9ec4-439f-9898-b02d42e45549" (UID: "3ffd8efd-9ec4-439f-9898-b02d42e45549"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:08:06.913672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913594 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ffd8efd-9ec4-439f-9898-b02d42e45549-ca-trust-extracted\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:06.913672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913630 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-registry-tls\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:06.913672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913647 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-image-registry-private-configuration\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:06.913672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913662 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ffd8efd-9ec4-439f-9898-b02d42e45549-installation-pull-secrets\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:06.913672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913675 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-bound-sa-token\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:06.913986 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913685 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ffd8efd-9ec4-439f-9898-b02d42e45549-trusted-ca\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 
22:08:06.913986 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:06.913699 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8g855\" (UniqueName: \"kubernetes.io/projected/3ffd8efd-9ec4-439f-9898-b02d42e45549-kube-api-access-8g855\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:08:07.608143 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.608102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" event={"ID":"3ffd8efd-9ec4-439f-9898-b02d42e45549","Type":"ContainerDied","Data":"860854993c3db90fdaf2e6f9d815f46f2b7e50c8a2f1a367a3defb823684d83b"} Apr 16 22:08:07.608590 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.608158 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-758b7d7d5d-mz6gt" Apr 16 22:08:07.608590 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.608181 2575 scope.go:117] "RemoveContainer" containerID="db4a52d424c5a0be576e13c525972a4cf6e904314f28201e75bc64e201ab0f49" Apr 16 22:08:07.609985 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.609960 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-9vf4q" event={"ID":"3672d731-f084-4334-a03d-3a333467d313","Type":"ContainerStarted","Data":"083ccb5c078c4d220c7199ff627fe9308b3e024ae455e4cbe2dec0dd136c6276"} Apr 16 22:08:07.610186 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.610163 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:08:07.622951 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.622930 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-9vf4q" Apr 16 22:08:07.628768 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.628713 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/downloads-6bcc868b7-9vf4q" podStartSLOduration=2.596272239 podStartE2EDuration="18.62866881s" podCreationTimestamp="2026-04-16 22:07:49 +0000 UTC" firstStartedPulling="2026-04-16 22:07:50.553909599 +0000 UTC m=+176.133412303" lastFinishedPulling="2026-04-16 22:08:06.586306164 +0000 UTC m=+192.165808874" observedRunningTime="2026-04-16 22:08:07.628102132 +0000 UTC m=+193.207604858" watchObservedRunningTime="2026-04-16 22:08:07.62866881 +0000 UTC m=+193.208171538" Apr 16 22:08:07.641401 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.641373 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-758b7d7d5d-mz6gt"] Apr 16 22:08:07.644562 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:07.644538 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-758b7d7d5d-mz6gt"] Apr 16 22:08:09.012772 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:09.012739 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" path="/var/lib/kubelet/pods/3ffd8efd-9ec4-439f-9898-b02d42e45549/volumes" Apr 16 22:08:09.460434 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:09.460358 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:08:09.460585 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:09.460451 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:08:16.455696 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:16.455624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/init-config-reloader/0.log" Apr 16 22:08:16.655141 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:16.655113 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/alertmanager/0.log" Apr 16 22:08:16.855238 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:16.855211 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/config-reloader/0.log" Apr 16 22:08:17.055866 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:17.055838 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/kube-rbac-proxy-web/0.log" Apr 16 22:08:17.255508 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:17.255479 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/kube-rbac-proxy/0.log" Apr 16 22:08:17.456517 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:17.456488 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/kube-rbac-proxy-metric/0.log" Apr 16 22:08:17.655063 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:17.654971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/prom-label-proxy/0.log" Apr 16 22:08:17.859394 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:17.859368 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-xbj2b_8aa498c8-4e70-44a5-8cf3-8c5794a14bc9/cluster-monitoring-operator/0.log" Apr 16 22:08:18.654910 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:18.654881 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-698654f8b9-vltwc_09dba9e2-ce5f-46b8-a09f-8fa332e68991/metrics-server/0.log" Apr 16 22:08:19.655370 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:19.655342 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmr6l_d7c988dc-9643-4f55-9745-2403cd54fc4a/init-textfile/0.log" Apr 16 22:08:19.855755 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:19.855725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmr6l_d7c988dc-9643-4f55-9745-2403cd54fc4a/node-exporter/0.log" Apr 16 22:08:20.054880 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:20.054848 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmr6l_d7c988dc-9643-4f55-9745-2403cd54fc4a/kube-rbac-proxy/0.log" Apr 16 22:08:20.855861 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:20.855832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lpjcl_576ba2b0-0acf-4938-bae7-06f509b251ae/kube-rbac-proxy-main/0.log" Apr 16 22:08:21.058058 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:21.058033 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lpjcl_576ba2b0-0acf-4938-bae7-06f509b251ae/kube-rbac-proxy-self/0.log" Apr 16 22:08:21.255037 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:21.254998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lpjcl_576ba2b0-0acf-4938-bae7-06f509b251ae/openshift-state-metrics/0.log" Apr 16 22:08:21.455084 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:21.455061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/init-config-reloader/0.log" Apr 16 22:08:21.660215 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:21.660141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/prometheus/0.log" Apr 16 22:08:21.855441 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:08:21.855418 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/config-reloader/0.log" Apr 16 22:08:22.055812 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:22.055789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/thanos-sidecar/0.log" Apr 16 22:08:22.255429 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:22.255400 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/kube-rbac-proxy-web/0.log" Apr 16 22:08:22.455031 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:22.454945 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/kube-rbac-proxy/0.log" Apr 16 22:08:22.655475 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:22.655448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92a264a8-cb5e-4129-952c-8aa774856233/kube-rbac-proxy-thanos/0.log" Apr 16 22:08:23.255047 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:23.255001 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-4trnb_481c641d-358b-4737-befb-5b91970311c7/prometheus-operator-admission-webhook/0.log" Apr 16 22:08:23.455699 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:23.455674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/thanos-query/0.log" Apr 16 22:08:23.655422 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:23.655350 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy-web/0.log" Apr 16 22:08:23.855496 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:23.855467 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy/0.log" Apr 16 22:08:24.059037 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:24.056134 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/prom-label-proxy/0.log" Apr 16 22:08:24.254634 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:24.254594 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy-rules/0.log" Apr 16 22:08:24.455607 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:24.455530 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy-metrics/0.log" Apr 16 22:08:25.657169 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:25.657138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-9vf4q_3672d731-f084-4334-a03d-3a333467d313/download-server/0.log" Apr 16 22:08:26.455347 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:26.455317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fkq5z_3cb4878c-59b3-48d2-8c2e-646f1605bf4e/serve-healthcheck-canary/0.log" Apr 16 22:08:29.466140 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:29.466107 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:08:29.470375 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:29.470351 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-698654f8b9-vltwc" Apr 16 22:08:51.336481 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:51.336445 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:08:51.413303 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:51.413264 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:08:51.758409 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:08:51.758382 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:05.052778 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.052687 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:09:05.053306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.053084 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="alertmanager" containerID="cri-o://63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a" gracePeriod=120 Apr 16 22:09:05.053306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.053127 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy" containerID="cri-o://c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24" gracePeriod=120 Apr 16 22:09:05.053306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.053152 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="prom-label-proxy" 
containerID="cri-o://ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02" gracePeriod=120 Apr 16 22:09:05.053306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.053145 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="config-reloader" containerID="cri-o://3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54" gracePeriod=120 Apr 16 22:09:05.053306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.053181 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-metric" containerID="cri-o://1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8" gracePeriod=120 Apr 16 22:09:05.053306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.053131 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-web" containerID="cri-o://aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d" gracePeriod=120 Apr 16 22:09:05.699229 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.699195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: \"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:09:05.701456 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.701431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0e5fb7-90e1-4234-a572-2eeac57ba8d9-metrics-certs\") pod \"network-metrics-daemon-wqzqv\" (UID: 
\"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9\") " pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:09:05.783965 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.783939 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02" exitCode=0 Apr 16 22:09:05.783965 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.783960 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24" exitCode=0 Apr 16 22:09:05.783965 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.783967 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54" exitCode=0 Apr 16 22:09:05.783965 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.783972 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a" exitCode=0 Apr 16 22:09:05.784195 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.784026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02"} Apr 16 22:09:05.784195 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.784060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24"} Apr 16 22:09:05.784195 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.784073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54"} Apr 16 22:09:05.784195 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.784085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a"} Apr 16 22:09:05.911304 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.911280 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrzhk\"" Apr 16 22:09:05.918946 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:05.918927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqzqv" Apr 16 22:09:06.034059 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.034024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wqzqv"] Apr 16 22:09:06.036885 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:09:06.036859 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef0e5fb7_90e1_4234_a572_2eeac57ba8d9.slice/crio-3c3a584a20e593ed210cf28f4e78d5f258ead6cdbd174c04c3aebb4b9c8b7043 WatchSource:0}: Error finding container 3c3a584a20e593ed210cf28f4e78d5f258ead6cdbd174c04c3aebb4b9c8b7043: Status 404 returned error can't find the container with id 3c3a584a20e593ed210cf28f4e78d5f258ead6cdbd174c04c3aebb4b9c8b7043 Apr 16 22:09:06.298032 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.297988 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:06.404723 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404694 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-web-config\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.404893 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404746 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.404893 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404792 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.404893 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404827 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-volume\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.404893 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404861 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-trusted-ca-bundle\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: 
\"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.404893 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404892 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-web\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404922 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-out\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.404949 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvgpm\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-kube-api-access-rvgpm\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405001 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-cluster-tls-config\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405062 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-metrics-client-ca\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405090 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405118 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-tls-assets\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405177 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405164 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-main-db\") pod \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\" (UID: \"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf\") " Apr 16 22:09:06.405937 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405671 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:09:06.405937 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405686 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:09:06.405937 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.405898 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:09:06.407626 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.407596 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-volume" (OuterVolumeSpecName: "config-volume") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.408075 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.408030 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.408182 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.408069 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.408182 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.408088 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.408182 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.408101 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.408604 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.408586 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). 
InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:09:06.408973 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.408959 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-kube-api-access-rvgpm" (OuterVolumeSpecName: "kube-api-access-rvgpm") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "kube-api-access-rvgpm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:09:06.409350 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.409328 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-out" (OuterVolumeSpecName: "config-out") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:09:06.412058 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.411985 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.417619 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.417600 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-web-config" (OuterVolumeSpecName: "web-config") pod "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" (UID: "5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:06.506113 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506085 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-volume\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506116 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506134 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506149 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-config-out\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506172 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvgpm\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-kube-api-access-rvgpm\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506186 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-cluster-tls-config\") on node \"ip-10-0-138-154.ec2.internal\" 
DevicePath \"\"" Apr 16 22:09:06.506213 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506200 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-metrics-client-ca\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506427 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506214 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506427 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506229 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-tls-assets\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506427 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506243 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-alertmanager-main-db\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506427 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506255 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-web-config\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506427 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506271 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-main-tls\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.506427 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.506285 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\"" Apr 16 22:09:06.788742 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.788701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wqzqv" event={"ID":"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9","Type":"ContainerStarted","Data":"3c3a584a20e593ed210cf28f4e78d5f258ead6cdbd174c04c3aebb4b9c8b7043"} Apr 16 22:09:06.791661 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791627 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8" exitCode=0 Apr 16 22:09:06.791661 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791656 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerID="aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d" exitCode=0 Apr 16 22:09:06.791828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8"} Apr 16 22:09:06.791828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d"} Apr 16 22:09:06.791828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791748 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf","Type":"ContainerDied","Data":"f40b429fb7d04ae798ec8754234b22523104763aa05411436340c8cb6ba0cbd0"} Apr 16 22:09:06.791828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791752 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:06.791828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.791769 2575 scope.go:117] "RemoveContainer" containerID="ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02" Apr 16 22:09:06.814833 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.814810 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:09:06.818105 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.818076 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:09:06.841971 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.841947 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:09:06.842343 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842328 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" Apr 16 22:09:06.842404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842345 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" Apr 16 22:09:06.842404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842361 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-web" Apr 16 22:09:06.842404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842371 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-web" Apr 16 22:09:06.842404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842379 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy" Apr 16 22:09:06.842404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842387 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842411 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="prom-label-proxy" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842419 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="prom-label-proxy" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842433 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-metric" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842441 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-metric" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842451 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="init-config-reloader" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842460 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="init-config-reloader" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842471 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="alertmanager" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842477 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="alertmanager" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842484 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="config-reloader" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842489 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="config-reloader" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842536 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842544 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-web" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842556 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="prom-label-proxy" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842566 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="alertmanager" Apr 16 22:09:06.842580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842578 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ffd8efd-9ec4-439f-9898-b02d42e45549" containerName="registry" Apr 16 22:09:06.843153 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842588 2575 
memory_manager.go:356] "RemoveStaleState removing state" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="config-reloader" Apr 16 22:09:06.843153 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.842598 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" containerName="kube-rbac-proxy-metric" Apr 16 22:09:06.846606 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.846590 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:06.848697 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.848678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 22:09:06.848817 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.848674 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 22:09:06.849126 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849096 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 22:09:06.849126 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 22:09:06.849265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 22:09:06.849265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849180 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 22:09:06.849265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849180 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-74x5v\"" Apr 16 22:09:06.849416 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849291 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 22:09:06.849808 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.849775 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 22:09:06.853738 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.853719 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 22:09:06.856456 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:06.856436 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:09:07.003157 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.003134 2575 scope.go:117] "RemoveContainer" containerID="1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8" Apr 16 22:09:07.010661 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010635 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3df593f4-d44b-4912-af6f-cc22bdce7c54-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.010748 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010671 2575 scope.go:117] "RemoveContainer" containerID="c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24" Apr 16 22:09:07.010748 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3df593f4-d44b-4912-af6f-cc22bdce7c54-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.010853 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3df593f4-d44b-4912-af6f-cc22bdce7c54-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.010853 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mp2k\" (UniqueName: \"kubernetes.io/projected/3df593f4-d44b-4912-af6f-cc22bdce7c54-kube-api-access-4mp2k\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.010940 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.010940 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010891 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.010940 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.011118 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3df593f4-d44b-4912-af6f-cc22bdce7c54-config-out\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.011118 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.010993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-config-volume\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.011118 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.011063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.011118 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.011093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.011289 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.011164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-web-config\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.011289 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.011191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3df593f4-d44b-4912-af6f-cc22bdce7c54-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.012593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.012567 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf" path="/var/lib/kubelet/pods/5a11ddc3-da0b-4ce4-af87-2cec6aa5dccf/volumes" Apr 16 22:09:07.018069 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.018050 2575 scope.go:117] "RemoveContainer" containerID="aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d" Apr 16 22:09:07.046857 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.046826 2575 scope.go:117] "RemoveContainer" containerID="3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54" Apr 16 22:09:07.054105 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.054080 2575 scope.go:117] "RemoveContainer" containerID="63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a" Apr 16 22:09:07.065226 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:09:07.065210 2575 scope.go:117] "RemoveContainer" containerID="4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348" Apr 16 22:09:07.077929 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.077867 2575 scope.go:117] "RemoveContainer" containerID="ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02" Apr 16 22:09:07.078190 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.078155 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02\": container with ID starting with ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02 not found: ID does not exist" containerID="ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02" Apr 16 22:09:07.078292 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.078201 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02"} err="failed to get container status \"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02\": rpc error: code = NotFound desc = could not find container \"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02\": container with ID starting with ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02 not found: ID does not exist" Apr 16 22:09:07.078292 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.078237 2575 scope.go:117] "RemoveContainer" containerID="1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8" Apr 16 22:09:07.078645 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.078619 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8\": container with ID starting with 
1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8 not found: ID does not exist" containerID="1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8" Apr 16 22:09:07.078725 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.078654 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8"} err="failed to get container status \"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8\": rpc error: code = NotFound desc = could not find container \"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8\": container with ID starting with 1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8 not found: ID does not exist" Apr 16 22:09:07.078725 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.078678 2575 scope.go:117] "RemoveContainer" containerID="c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24" Apr 16 22:09:07.078962 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.078929 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24\": container with ID starting with c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24 not found: ID does not exist" containerID="c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24" Apr 16 22:09:07.079081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.078964 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24"} err="failed to get container status \"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24\": rpc error: code = NotFound desc = could not find container \"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24\": container with ID starting with 
c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24 not found: ID does not exist" Apr 16 22:09:07.079081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.078988 2575 scope.go:117] "RemoveContainer" containerID="aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d" Apr 16 22:09:07.079326 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.079304 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d\": container with ID starting with aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d not found: ID does not exist" containerID="aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d" Apr 16 22:09:07.079393 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.079336 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d"} err="failed to get container status \"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d\": rpc error: code = NotFound desc = could not find container \"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d\": container with ID starting with aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d not found: ID does not exist" Apr 16 22:09:07.079393 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.079357 2575 scope.go:117] "RemoveContainer" containerID="3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54" Apr 16 22:09:07.079653 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.079632 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54\": container with ID starting with 3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54 not found: ID does not exist" 
containerID="3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54" Apr 16 22:09:07.079728 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.079658 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54"} err="failed to get container status \"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54\": rpc error: code = NotFound desc = could not find container \"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54\": container with ID starting with 3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54 not found: ID does not exist" Apr 16 22:09:07.079728 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.079676 2575 scope.go:117] "RemoveContainer" containerID="63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a" Apr 16 22:09:07.079926 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.079909 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a\": container with ID starting with 63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a not found: ID does not exist" containerID="63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a" Apr 16 22:09:07.079991 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.079931 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a"} err="failed to get container status \"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a\": rpc error: code = NotFound desc = could not find container \"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a\": container with ID starting with 63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a not found: ID does not exist" Apr 16 
22:09:07.079991 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.079949 2575 scope.go:117] "RemoveContainer" containerID="4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348" Apr 16 22:09:07.080209 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:07.080170 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348\": container with ID starting with 4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348 not found: ID does not exist" containerID="4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348" Apr 16 22:09:07.080297 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.080204 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348"} err="failed to get container status \"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348\": rpc error: code = NotFound desc = could not find container \"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348\": container with ID starting with 4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348 not found: ID does not exist" Apr 16 22:09:07.080297 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.080227 2575 scope.go:117] "RemoveContainer" containerID="ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02" Apr 16 22:09:07.081061 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.080973 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02"} err="failed to get container status \"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02\": rpc error: code = NotFound desc = could not find container \"ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02\": container with ID starting 
with ad60a8b1ad7169683bf3eb71ddf8d21877aee668a691d1c33e5c2b539acade02 not found: ID does not exist" Apr 16 22:09:07.081061 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.081002 2575 scope.go:117] "RemoveContainer" containerID="1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8" Apr 16 22:09:07.081806 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.081299 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8"} err="failed to get container status \"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8\": rpc error: code = NotFound desc = could not find container \"1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8\": container with ID starting with 1a200539823dc058bfe620160bfdccf1585bdcdf83b633872988562ad6e0b8b8 not found: ID does not exist" Apr 16 22:09:07.081806 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.081326 2575 scope.go:117] "RemoveContainer" containerID="c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24" Apr 16 22:09:07.081806 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.081640 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24"} err="failed to get container status \"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24\": rpc error: code = NotFound desc = could not find container \"c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24\": container with ID starting with c7538980b600ae7f0bb7cacf3e630f51871b55eca1941777775d1db7a54ddd24 not found: ID does not exist" Apr 16 22:09:07.081806 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.081663 2575 scope.go:117] "RemoveContainer" containerID="aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d" Apr 16 22:09:07.082173 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082002 
2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d"} err="failed to get container status \"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d\": rpc error: code = NotFound desc = could not find container \"aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d\": container with ID starting with aa4c3943aa12d9b6bd9d7ed4aeece9208198d939add18b81997bf7377c36d28d not found: ID does not exist" Apr 16 22:09:07.082173 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082044 2575 scope.go:117] "RemoveContainer" containerID="3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54" Apr 16 22:09:07.082414 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082392 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54"} err="failed to get container status \"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54\": rpc error: code = NotFound desc = could not find container \"3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54\": container with ID starting with 3991e7df7c2249c4eff8c42ae93bbb9c62e7623981cab3e9a25cf9a35c3d2e54 not found: ID does not exist" Apr 16 22:09:07.082488 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082416 2575 scope.go:117] "RemoveContainer" containerID="63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a" Apr 16 22:09:07.082696 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082658 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a"} err="failed to get container status \"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a\": rpc error: code = NotFound desc = could not find container 
\"63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a\": container with ID starting with 63ccd363af0b237043108e8dbb7b4496a9e4dde1a3f1a4376c01e440bfff7a2a not found: ID does not exist" Apr 16 22:09:07.082790 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082690 2575 scope.go:117] "RemoveContainer" containerID="4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348" Apr 16 22:09:07.083040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.082998 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348"} err="failed to get container status \"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348\": rpc error: code = NotFound desc = could not find container \"4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348\": container with ID starting with 4f271f2caccb43a72faa23012c5fb250a102567392e110a7bed13b3da6126348 not found: ID does not exist" Apr 16 22:09:07.112505 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-web-config\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112631 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3df593f4-d44b-4912-af6f-cc22bdce7c54-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112693 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/3df593f4-d44b-4912-af6f-cc22bdce7c54-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112749 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3df593f4-d44b-4912-af6f-cc22bdce7c54-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112749 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3df593f4-d44b-4912-af6f-cc22bdce7c54-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mp2k\" (UniqueName: \"kubernetes.io/projected/3df593f4-d44b-4912-af6f-cc22bdce7c54-kube-api-access-4mp2k\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112827 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3df593f4-d44b-4912-af6f-cc22bdce7c54-config-out\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-config-volume\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.112993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.112987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.113604 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.113264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3df593f4-d44b-4912-af6f-cc22bdce7c54-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.113682 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.113622 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3df593f4-d44b-4912-af6f-cc22bdce7c54-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.114522 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.114493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3df593f4-d44b-4912-af6f-cc22bdce7c54-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.115875 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.115826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3df593f4-d44b-4912-af6f-cc22bdce7c54-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.118381 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.118333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.118511 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.118453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-config-volume\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.118915 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.118888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.119344 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.119304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.119590 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.119561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/3df593f4-d44b-4912-af6f-cc22bdce7c54-config-out\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.119678 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.119654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.119932 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.119912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-web-config\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.120672 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.120629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mp2k\" (UniqueName: \"kubernetes.io/projected/3df593f4-d44b-4912-af6f-cc22bdce7c54-kube-api-access-4mp2k\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.121300 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.121278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3df593f4-d44b-4912-af6f-cc22bdce7c54-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3df593f4-d44b-4912-af6f-cc22bdce7c54\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.158665 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.158644 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:09:07.298575 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.298521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:09:07.301257 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:09:07.301234 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df593f4_d44b_4912_af6f_cc22bdce7c54.slice/crio-ad0cffa90a6b1fe053fde6a083318a560ed951ce5a21fc34193b7ffd2f2e9319 WatchSource:0}: Error finding container ad0cffa90a6b1fe053fde6a083318a560ed951ce5a21fc34193b7ffd2f2e9319: Status 404 returned error can't find the container with id ad0cffa90a6b1fe053fde6a083318a560ed951ce5a21fc34193b7ffd2f2e9319 Apr 16 22:09:07.796154 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.796120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wqzqv" event={"ID":"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9","Type":"ContainerStarted","Data":"569564d9f323a3dfe605dbf8ab3f65bce28050a083705ac781d53fbe18079f34"} Apr 16 22:09:07.796303 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.796160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wqzqv" event={"ID":"ef0e5fb7-90e1-4234-a572-2eeac57ba8d9","Type":"ContainerStarted","Data":"fe940b1eebfb0bcb4d871a7ebad6202b5e7a92d4b9c22696587292b18e1068df"} Apr 16 22:09:07.797478 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.797452 2575 generic.go:358] "Generic (PLEG): container finished" podID="3df593f4-d44b-4912-af6f-cc22bdce7c54" containerID="54a94d726049da457a06de94e249b7e1370f7b4d9c012e75e63446e9e161cdbc" exitCode=0 Apr 16 22:09:07.797604 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.797537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerDied","Data":"54a94d726049da457a06de94e249b7e1370f7b4d9c012e75e63446e9e161cdbc"} Apr 16 22:09:07.797604 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.797565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"ad0cffa90a6b1fe053fde6a083318a560ed951ce5a21fc34193b7ffd2f2e9319"} Apr 16 22:09:07.811911 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:07.811860 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wqzqv" podStartSLOduration=251.794979004 podStartE2EDuration="4m12.811844677s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:09:06.03862375 +0000 UTC m=+251.618126455" lastFinishedPulling="2026-04-16 22:09:07.055489414 +0000 UTC m=+252.634992128" observedRunningTime="2026-04-16 22:09:07.810236083 +0000 UTC m=+253.389738811" watchObservedRunningTime="2026-04-16 22:09:07.811844677 +0000 UTC m=+253.391347406" Apr 16 22:09:08.805272 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.805237 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"42043166722d779114f981c3c9b8f36e58dbea9019b2bfa29354aba97937de38"} Apr 16 22:09:08.805272 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.805276 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"6c6d5423f1b101567e1972e9e69b54e7fb06d6c9fcbbe73fa8f48b2b5de316da"} Apr 16 22:09:08.805656 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.805286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"31b02727919422973ca9051731f7d4282a17077589eb175e4db78f0f9e5b4325"} Apr 16 22:09:08.805656 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.805296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"52fd61e84f3cd797eaf68daa7560b115b44ec44e90ef35fd267558d16d3f75d9"} Apr 16 22:09:08.805656 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.805304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"e0fbad307494fd3414ec711127ce0769001bd6ea269dac63262f73bcf8e888f4"} Apr 16 22:09:08.805656 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.805313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3df593f4-d44b-4912-af6f-cc22bdce7c54","Type":"ContainerStarted","Data":"91b93fb8d0c062a8cb67e5b77915afa677ad683180e78fc4507b317c31fe55d4"} Apr 16 22:09:08.830718 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:08.830599 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.830580231 podStartE2EDuration="2.830580231s" podCreationTimestamp="2026-04-16 22:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:09:08.828069586 +0000 UTC m=+254.407572314" watchObservedRunningTime="2026-04-16 22:09:08.830580231 +0000 UTC m=+254.410082959" Apr 16 22:09:09.080647 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.080564 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-577757d8d8-rdn4x"] Apr 16 22:09:09.084472 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:09:09.084450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x" Apr 16 22:09:09.086541 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.086521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 22:09:09.086835 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.086817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 22:09:09.086950 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.086923 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-lc9hv\"" Apr 16 22:09:09.087034 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.086961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 22:09:09.087121 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.087093 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 22:09:09.087322 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.087306 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 22:09:09.091762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.091564 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 22:09:09.096483 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.096460 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-577757d8d8-rdn4x"] Apr 16 22:09:09.130119 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:09:09.130092 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-secret-telemeter-client\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-telemeter-client-tls\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130211 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-metrics-client-ca\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-telemeter-trusted-ca-bundle\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-federate-client-tls\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-serving-certs-ca-bundle\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.130307 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.130290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rh2j\" (UniqueName: \"kubernetes.io/projected/fab58a82-0471-4bb7-bc78-efbd4adc3dea-kube-api-access-9rh2j\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.230620 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.230594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-telemeter-trusted-ca-bundle\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.230757 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.230629 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-federate-client-tls\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.230757 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.230648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-serving-certs-ca-bundle\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.230874 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.230852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rh2j\" (UniqueName: \"kubernetes.io/projected/fab58a82-0471-4bb7-bc78-efbd4adc3dea-kube-api-access-9rh2j\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.230935 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.230919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-secret-telemeter-client\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.231086 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.231064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-telemeter-client-tls\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.231228 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.231093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.231228 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.231114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-metrics-client-ca\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.231482 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.231453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-serving-certs-ca-bundle\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.231576 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.231551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-telemeter-trusted-ca-bundle\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.231786 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.231763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fab58a82-0471-4bb7-bc78-efbd4adc3dea-metrics-client-ca\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.233293 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.233273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-federate-client-tls\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.233379 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.233350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.233685 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.233664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-telemeter-client-tls\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.233817 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.233796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fab58a82-0471-4bb7-bc78-efbd4adc3dea-secret-telemeter-client\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.238079 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.238053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rh2j\" (UniqueName: \"kubernetes.io/projected/fab58a82-0471-4bb7-bc78-efbd4adc3dea-kube-api-access-9rh2j\") pod \"telemeter-client-577757d8d8-rdn4x\" (UID: \"fab58a82-0471-4bb7-bc78-efbd4adc3dea\") " pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.363205 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.363134 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 22:09:09.363782 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.363750 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="prometheus" containerID="cri-o://9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" gracePeriod=600
Apr 16 22:09:09.364763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.363921 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy" containerID="cri-o://937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" gracePeriod=600
Apr 16 22:09:09.364763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.364043 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-web" containerID="cri-o://083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" gracePeriod=600
Apr 16 22:09:09.364763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.364071 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-thanos" containerID="cri-o://c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" gracePeriod=600
Apr 16 22:09:09.364763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.364114 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="thanos-sidecar" containerID="cri-o://fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" gracePeriod=600
Apr 16 22:09:09.364763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.364159 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="config-reloader" containerID="cri-o://ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" gracePeriod=600
Apr 16 22:09:09.394724 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.394702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x"
Apr 16 22:09:09.520125 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.520058 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-577757d8d8-rdn4x"]
Apr 16 22:09:09.522227 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:09:09.522200 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab58a82_0471_4bb7_bc78_efbd4adc3dea.slice/crio-f7870c824afeccce3cf770c64e50c7e0055bab8ce5a8554b27d167b2a3a41100 WatchSource:0}: Error finding container f7870c824afeccce3cf770c64e50c7e0055bab8ce5a8554b27d167b2a3a41100: Status 404 returned error can't find the container with id f7870c824afeccce3cf770c64e50c7e0055bab8ce5a8554b27d167b2a3a41100
Apr 16 22:09:09.614990 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.614941 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:09:09.634323 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634304 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-grpc-tls\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634412 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634341 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-rulefiles-0\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634412 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634360 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-thanos-prometheus-http-client-file\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634510 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634493 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47t6q\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-kube-api-access-47t6q\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634542 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634527 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-tls\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634574 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634559 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634709 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634678 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-web-config\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634839 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634724 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-db\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634839 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634771 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-config-out\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634839 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634806 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-tls-assets\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634989 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634839 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-metrics-client-certs\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634989 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634882 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-kube-rbac-proxy\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634989 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634916 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-kubelet-serving-ca-bundle\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634989 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634952 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-serving-certs-ca-bundle\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.634989 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.634980 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-trusted-ca-bundle\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.635269 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.635025 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-metrics-client-ca\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.635269 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.635051 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-config\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.635269 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.635083 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"92a264a8-cb5e-4129-952c-8aa774856233\" (UID: \"92a264a8-cb5e-4129-952c-8aa774856233\") "
Apr 16 22:09:09.635856 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.635830 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:09:09.636507 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.635747 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:09:09.636507 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.636193 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:09:09.636507 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.636224 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:09:09.636507 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.636243 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:09:09.637063 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.637037 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:09:09.639612 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.639495 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-kube-api-access-47t6q" (OuterVolumeSpecName: "kube-api-access-47t6q") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "kube-api-access-47t6q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:09:09.639612 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.639571 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.640065 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.639958 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.641255 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.641219 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.641355 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.641268 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.641355 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.641308 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.641478 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.641369 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.642579 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.642544 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.642823 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.642787 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-config-out" (OuterVolumeSpecName: "config-out") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:09:09.642993 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.642955 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-config" (OuterVolumeSpecName: "config") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.643795 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.643744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:09:09.653192 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.653167 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-web-config" (OuterVolumeSpecName: "web-config") pod "92a264a8-cb5e-4129-952c-8aa774856233" (UID: "92a264a8-cb5e-4129-952c-8aa774856233"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:09:09.735702 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735677 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-kube-rbac-proxy\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735702 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735701 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735712 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735722 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735733 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-configmap-metrics-client-ca\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735743 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-config\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735753 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735763 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-grpc-tls\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735772 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735781 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735790 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47t6q\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-kube-api-access-47t6q\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735799 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735808 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735817 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-web-config\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735825 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-prometheus-k8s-db\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.735831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735833 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92a264a8-cb5e-4129-952c-8aa774856233-config-out\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.736237 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735842 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92a264a8-cb5e-4129-952c-8aa774856233-tls-assets\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.736237 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.735850 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92a264a8-cb5e-4129-952c-8aa774856233-secret-metrics-client-certs\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:09:09.810459 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810434 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" exitCode=0
Apr 16 22:09:09.810459 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810457 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" exitCode=0
Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810470 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" exitCode=0
Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810479 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" exitCode=0
Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810487 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" exitCode=0
Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810494 2575 generic.go:358] "Generic (PLEG): container finished" podID="92a264a8-cb5e-4129-952c-8aa774856233" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" exitCode=0
Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0"
event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810544 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810553 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810606 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92a264a8-cb5e-4129-952c-8aa774856233","Type":"ContainerDied","Data":"e831a90a451c98a8ee86836641fabb1070dd81e375b8590b8f1e0823514fda21"} Apr 16 22:09:09.810878 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.810627 2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.811670 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.811615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x" event={"ID":"fab58a82-0471-4bb7-bc78-efbd4adc3dea","Type":"ContainerStarted","Data":"f7870c824afeccce3cf770c64e50c7e0055bab8ce5a8554b27d167b2a3a41100"} Apr 16 22:09:09.818169 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.818131 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.824748 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.824731 2575 scope.go:117] "RemoveContainer" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.831483 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.831357 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.832818 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.832797 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:09:09.835749 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.835723 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 
22:09:09.839227 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.839209 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.845464 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.845443 2575 scope.go:117] "RemoveContainer" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.852064 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.852045 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.858256 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858233 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:09:09.858316 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858273 2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.858530 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.858507 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.858593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858542 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} err="failed to get container status \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with 
c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" Apr 16 22:09:09.858593 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858566 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.858687 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858642 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="config-reloader" Apr 16 22:09:09.858687 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858658 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="config-reloader" Apr 16 22:09:09.858687 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858681 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="init-config-reloader" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858690 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="init-config-reloader" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858703 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="prometheus" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858710 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="prometheus" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858722 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="thanos-sidecar" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858730 2575 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="thanos-sidecar" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858739 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-web" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858747 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-web" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858763 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-thanos" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858771 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-thanos" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858783 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858793 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy" Apr 16 22:09:09.858814 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.858792 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.859284 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:09:09.858826 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} err="failed to get container status \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858849 2575 scope.go:117] "RemoveContainer" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858867 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="prometheus" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858880 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-web" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858892 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy-thanos" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858904 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="kube-rbac-proxy" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858916 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="thanos-sidecar" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.858927 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="92a264a8-cb5e-4129-952c-8aa774856233" containerName="config-reloader" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.859066 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859089 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} err="failed to get container status \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" Apr 16 22:09:09.859284 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859112 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.859739 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.859327 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.859739 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859343 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} err="failed to get container status \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" Apr 16 22:09:09.859739 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859364 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.859739 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.859567 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.859739 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859606 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} err="failed to get container status \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" Apr 16 22:09:09.859739 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859626 2575 scope.go:117] "RemoveContainer" 
containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.859934 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.859874 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.859934 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859897 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} err="failed to get container status \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" Apr 16 22:09:09.859934 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.859910 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.860218 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:09:09.860201 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.860262 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860221 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"} err="failed to get container status \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": rpc error: code = NotFound desc = could not find container \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" Apr 16 22:09:09.860262 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860239 2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.860457 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860437 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} err="failed to get container status \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" Apr 16 22:09:09.860498 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860458 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.860668 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860652 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} err="failed to get container status \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 
937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" Apr 16 22:09:09.860705 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860669 2575 scope.go:117] "RemoveContainer" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.860873 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860854 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} err="failed to get container status \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" Apr 16 22:09:09.860911 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.860875 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.861100 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861083 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} err="failed to get container status \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" Apr 16 22:09:09.861139 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861101 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.861327 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861309 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} err="failed to get container status \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" Apr 16 22:09:09.861382 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861330 2575 scope.go:117] "RemoveContainer" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.861527 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861511 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} err="failed to get container status \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" Apr 16 22:09:09.861569 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861527 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.861707 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861692 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"} err="failed to get container status \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": rpc error: code = NotFound desc = could not find container 
\"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" Apr 16 22:09:09.861744 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861708 2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.861874 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861858 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} err="failed to get container status \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" Apr 16 22:09:09.861924 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.861884 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.862118 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862100 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} err="failed to get container status \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" Apr 16 22:09:09.862190 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862120 2575 scope.go:117] "RemoveContainer" 
containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.862344 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862327 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} err="failed to get container status \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" Apr 16 22:09:09.862386 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862346 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.862536 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862514 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} err="failed to get container status \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" Apr 16 22:09:09.862583 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862537 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.862718 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862702 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} err="failed to get container status 
\"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" Apr 16 22:09:09.862755 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862717 2575 scope.go:117] "RemoveContainer" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.862909 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862892 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} err="failed to get container status \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" Apr 16 22:09:09.862967 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.862911 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.863127 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863112 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"} err="failed to get container status \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": rpc error: code = NotFound desc = could not find container \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" Apr 16 22:09:09.863172 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:09:09.863127 2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.863295 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863281 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} err="failed to get container status \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" Apr 16 22:09:09.863337 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863294 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.863455 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863442 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} err="failed to get container status \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" Apr 16 22:09:09.863505 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863456 2575 scope.go:117] "RemoveContainer" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.863642 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863623 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} err="failed to get container status \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" Apr 16 22:09:09.863681 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863644 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.863864 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863845 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} err="failed to get container status \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" Apr 16 22:09:09.863906 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.863865 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.864103 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864083 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} err="failed to get container status \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with 
ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" Apr 16 22:09:09.864157 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864105 2575 scope.go:117] "RemoveContainer" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.864271 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864257 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.864332 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864308 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} err="failed to get container status \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" Apr 16 22:09:09.864379 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864335 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.864762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864740 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"} err="failed to get container status \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": rpc error: code = NotFound desc = could not find container \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" Apr 16 22:09:09.864762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864761 
2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.865048 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.864988 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} err="failed to get container status \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" Apr 16 22:09:09.865048 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865004 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.865245 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865226 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} err="failed to get container status \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" Apr 16 22:09:09.865311 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865247 2575 scope.go:117] "RemoveContainer" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.865457 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865438 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} err="failed 
to get container status \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" Apr 16 22:09:09.865513 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865458 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.865692 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865675 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} err="failed to get container status \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" Apr 16 22:09:09.865762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865694 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.865943 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865925 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} err="failed to get container status \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" Apr 16 22:09:09.865987 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.865944 2575 scope.go:117] "RemoveContainer" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.866206 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866180 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} err="failed to get container status \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" Apr 16 22:09:09.866206 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866205 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.866446 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866418 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"} err="failed to get container status \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": rpc error: code = NotFound desc = could not find container \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" Apr 16 22:09:09.866501 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866446 2575 scope.go:117] "RemoveContainer" containerID="c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227" Apr 16 22:09:09.866541 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866528 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" 
Apr 16 22:09:09.866677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866649 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227"} err="failed to get container status \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": rpc error: code = NotFound desc = could not find container \"c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227\": container with ID starting with c0e36d00275e31eabc1842858d214b7fb191e6951ebb7b856a064c3bad641227 not found: ID does not exist" Apr 16 22:09:09.866677 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866674 2575 scope.go:117] "RemoveContainer" containerID="937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a" Apr 16 22:09:09.866819 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-458er8nm4et8a\"" Apr 16 22:09:09.866871 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866831 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:09:09.866871 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-bxj9z\"" Apr 16 22:09:09.866871 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866837 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:09:09.867035 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866907 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a"} err="failed to get container status 
\"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": rpc error: code = NotFound desc = could not find container \"937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a\": container with ID starting with 937febc5ebd05182402a57b8704f7da18acc6151888a9a78a44d6367247f813a not found: ID does not exist" Apr 16 22:09:09.867035 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866930 2575 scope.go:117] "RemoveContainer" containerID="083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31" Apr 16 22:09:09.867035 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.866847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:09:09.867196 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867044 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:09:09.867299 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867271 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31"} err="failed to get container status \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": rpc error: code = NotFound desc = could not find container \"083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31\": container with ID starting with 083430d14e6ce4a41cae585fcb09d76c6f70f5f90e12588b02ee2ff90c50dd31 not found: ID does not exist" Apr 16 22:09:09.867346 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867303 2575 scope.go:117] "RemoveContainer" containerID="fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9" Apr 16 22:09:09.867534 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:09:09.867639 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867614 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9"} err="failed to get container status \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": rpc error: code = NotFound desc = could not find container \"fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9\": container with ID starting with fa347a80cf4c91f3437228f11b9875d8f6c42c15589872b288d914001ba087a9 not found: ID does not exist" Apr 16 22:09:09.867639 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:09:09.867747 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867643 2575 scope.go:117] "RemoveContainer" containerID="ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2" Apr 16 22:09:09.867747 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867488 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:09:09.867747 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867708 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:09:09.867858 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:09:09.867931 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867909 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2"} err="failed to get container status 
\"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": rpc error: code = NotFound desc = could not find container \"ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2\": container with ID starting with ed5da542d1d7b422aa96c1b540de53b9fbd03c4fef07cd8ed587f8e2663c78c2 not found: ID does not exist" Apr 16 22:09:09.867990 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.867937 2575 scope.go:117] "RemoveContainer" containerID="9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2" Apr 16 22:09:09.868317 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.868296 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2"} err="failed to get container status \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": rpc error: code = NotFound desc = could not find container \"9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2\": container with ID starting with 9667fd8093caa79f815bb03fc4082eb9813b3a91a22d0a9a3a40de57a17868e2 not found: ID does not exist" Apr 16 22:09:09.868317 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.868319 2575 scope.go:117] "RemoveContainer" containerID="bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72" Apr 16 22:09:09.868644 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.868624 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72"} err="failed to get container status \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": rpc error: code = NotFound desc = could not find container \"bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72\": container with ID starting with bf7e7f2159524026b25d1d4224b842859359a5bf140ce0efba91f0f50e29bc72 not found: ID does not exist" Apr 16 22:09:09.869988 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:09:09.869970 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:09:09.874088 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.874067 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:09:09.874394 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.874375 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:09:09.937293 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937395 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937395 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z5sc\" (UniqueName: \"kubernetes.io/projected/45f95683-e1dc-42d4-8d51-f6135f368dc1-kube-api-access-9z5sc\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937477 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937397 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937477 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-config\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937558 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937558 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937558 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937658 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937658 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937658 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/45f95683-e1dc-42d4-8d51-f6135f368dc1-config-out\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937743 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/45f95683-e1dc-42d4-8d51-f6135f368dc1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937743 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" 
(UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937743 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937743 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937743 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:09.937909 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
22:09:09.937909 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:09.937766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-web-config\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038402 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038551 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038615 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038615 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/45f95683-e1dc-42d4-8d51-f6135f368dc1-config-out\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/45f95683-e1dc-42d4-8d51-f6135f368dc1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.038765 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-web-config\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z5sc\" (UniqueName: \"kubernetes.io/projected/45f95683-e1dc-42d4-8d51-f6135f368dc1-kube-api-access-9z5sc\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.038973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.039037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-config\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.039063 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039520 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.039388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.039520 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.039468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.041934 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.041905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/45f95683-e1dc-42d4-8d51-f6135f368dc1-config-out\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.042328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/45f95683-e1dc-42d4-8d51-f6135f368dc1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.042486 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.042957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.043298 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.043433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.044257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 
ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.044745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.044877 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.044803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.049190 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.045819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-config\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.049190 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.046203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.049190 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.046348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-web-config\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 22:09:10.049190 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.046403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45f95683-e1dc-42d4-8d51-f6135f368dc1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.050898 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.050863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/45f95683-e1dc-42d4-8d51-f6135f368dc1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.051083 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.051061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z5sc\" (UniqueName: \"kubernetes.io/projected/45f95683-e1dc-42d4-8d51-f6135f368dc1-kube-api-access-9z5sc\") pod \"prometheus-k8s-0\" (UID: \"45f95683-e1dc-42d4-8d51-f6135f368dc1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.176040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.175936 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:10.310685 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.310660 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:09:10.313598 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:09:10.313547 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f95683_e1dc_42d4_8d51_f6135f368dc1.slice/crio-1733d42ca254182326ad71e31c4640adfebd16529b45362e624a43ad70df88a4 WatchSource:0}: Error finding container 1733d42ca254182326ad71e31c4640adfebd16529b45362e624a43ad70df88a4: Status 404 returned error can't find the container with id 1733d42ca254182326ad71e31c4640adfebd16529b45362e624a43ad70df88a4 Apr 16 22:09:10.816575 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.816537 2575 generic.go:358] "Generic (PLEG): container finished" podID="45f95683-e1dc-42d4-8d51-f6135f368dc1" containerID="4ea38813565fc0aed446ea4333a1afd949926bc36dce214c812485edc99cf194" exitCode=0 Apr 16 22:09:10.816930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.816622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerDied","Data":"4ea38813565fc0aed446ea4333a1afd949926bc36dce214c812485edc99cf194"} Apr 16 22:09:10.816930 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:10.816655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"1733d42ca254182326ad71e31c4640adfebd16529b45362e624a43ad70df88a4"} Apr 16 22:09:11.017674 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.016924 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a264a8-cb5e-4129-952c-8aa774856233" 
path="/var/lib/kubelet/pods/92a264a8-cb5e-4129-952c-8aa774856233/volumes" Apr 16 22:09:11.822815 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.822733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"c748e63689e8d7f78ce99dbf5f098eb622ec17f61bc76f7dc73be8237d0b39d0"} Apr 16 22:09:11.822815 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.822776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"528cd717a4b2de29d4aa893dd663593c4d7e5ac527a115195d969c43e2a6e118"} Apr 16 22:09:11.822815 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.822789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"a183fbcf124abcd9fab06a40ec5eb127930bf1ea79848dc4bbac977184291b07"} Apr 16 22:09:11.822815 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.822802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"c9d1e90d3cf03c9124e18cf169c954a0e94db448ac592c324f9eaec16675959f"} Apr 16 22:09:11.822815 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.822814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"56fb974998e1d7d9bd523cc37a2e70c54d419bc96af246900c120209146e803f"} Apr 16 22:09:11.823412 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.822827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"45f95683-e1dc-42d4-8d51-f6135f368dc1","Type":"ContainerStarted","Data":"1f7e09d6e85e07934b18b5cfa96023cb1659a23b8541a25ff723d021c19db6ee"} Apr 16 22:09:11.824691 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.824664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x" event={"ID":"fab58a82-0471-4bb7-bc78-efbd4adc3dea","Type":"ContainerStarted","Data":"0eca8e6ccaceebebe833e65f483d4e1201f1c911ad723438696e8c91e608f5cb"} Apr 16 22:09:11.824786 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.824696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x" event={"ID":"fab58a82-0471-4bb7-bc78-efbd4adc3dea","Type":"ContainerStarted","Data":"f7cdc08131122ff38cc6c0bd4e276ec3501f0db0b2b645b27b91b0b2daaa06b2"} Apr 16 22:09:11.824786 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.824709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x" event={"ID":"fab58a82-0471-4bb7-bc78-efbd4adc3dea","Type":"ContainerStarted","Data":"23d673141f53fd24dc938ef605a8ed680dcfbff628a719b489ef0937199a5eb6"} Apr 16 22:09:11.848620 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.848572 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.848553768 podStartE2EDuration="2.848553768s" podCreationTimestamp="2026-04-16 22:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:09:11.845738771 +0000 UTC m=+257.425241499" watchObservedRunningTime="2026-04-16 22:09:11.848553768 +0000 UTC m=+257.428056496" Apr 16 22:09:11.866313 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:11.866275 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-577757d8d8-rdn4x" podStartSLOduration=1.005122692 podStartE2EDuration="2.866261706s" podCreationTimestamp="2026-04-16 22:09:09 +0000 UTC" firstStartedPulling="2026-04-16 22:09:09.524842411 +0000 UTC m=+255.104345115" lastFinishedPulling="2026-04-16 22:09:11.385981408 +0000 UTC m=+256.965484129" observedRunningTime="2026-04-16 22:09:11.864663481 +0000 UTC m=+257.444166208" watchObservedRunningTime="2026-04-16 22:09:11.866261706 +0000 UTC m=+257.445764433" Apr 16 22:09:15.176366 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:15.176332 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:09:54.901824 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:54.901791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log" Apr 16 22:09:54.902606 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:54.902574 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log" Apr 16 22:09:54.906025 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:09:54.905986 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:10:10.176602 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:10:10.176573 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:10:10.190950 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:10:10.190930 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:10:11.012235 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:10:11.012209 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:11:26.111405 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:11:26.111372 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-zv2pk"] Apr 16 22:11:26.114578 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.114561 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.116901 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.116880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-4q2qf\"" Apr 16 22:11:26.117031 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.116945 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 22:11:26.117031 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.116969 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 22:11:26.121747 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.121376 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-zv2pk"] Apr 16 22:11:26.216474 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.216445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56d82b0a-b1ab-472c-8271-823bb67d84fd-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-zv2pk\" (UID: \"56d82b0a-b1ab-472c-8271-823bb67d84fd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.216474 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.216474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgw4\" (UniqueName: \"kubernetes.io/projected/56d82b0a-b1ab-472c-8271-823bb67d84fd-kube-api-access-pdgw4\") pod \"cert-manager-webhook-597b96b99b-zv2pk\" 
(UID: \"56d82b0a-b1ab-472c-8271-823bb67d84fd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.316868 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.316822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56d82b0a-b1ab-472c-8271-823bb67d84fd-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-zv2pk\" (UID: \"56d82b0a-b1ab-472c-8271-823bb67d84fd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.316868 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.316871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgw4\" (UniqueName: \"kubernetes.io/projected/56d82b0a-b1ab-472c-8271-823bb67d84fd-kube-api-access-pdgw4\") pod \"cert-manager-webhook-597b96b99b-zv2pk\" (UID: \"56d82b0a-b1ab-472c-8271-823bb67d84fd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.324465 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.324428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56d82b0a-b1ab-472c-8271-823bb67d84fd-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-zv2pk\" (UID: \"56d82b0a-b1ab-472c-8271-823bb67d84fd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.324579 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.324564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgw4\" (UniqueName: \"kubernetes.io/projected/56d82b0a-b1ab-472c-8271-823bb67d84fd-kube-api-access-pdgw4\") pod \"cert-manager-webhook-597b96b99b-zv2pk\" (UID: \"56d82b0a-b1ab-472c-8271-823bb67d84fd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.433167 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.433087 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:26.550779 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.550756 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-zv2pk"] Apr 16 22:11:26.553553 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:11:26.553513 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d82b0a_b1ab_472c_8271_823bb67d84fd.slice/crio-cf1c2c6a7ca22a70f8e79b133144674ad69ffc214a7acb42123ce16191ccdc4c WatchSource:0}: Error finding container cf1c2c6a7ca22a70f8e79b133144674ad69ffc214a7acb42123ce16191ccdc4c: Status 404 returned error can't find the container with id cf1c2c6a7ca22a70f8e79b133144674ad69ffc214a7acb42123ce16191ccdc4c Apr 16 22:11:26.555320 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:26.555300 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:11:27.208892 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:27.208860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" event={"ID":"56d82b0a-b1ab-472c-8271-823bb67d84fd","Type":"ContainerStarted","Data":"cf1c2c6a7ca22a70f8e79b133144674ad69ffc214a7acb42123ce16191ccdc4c"} Apr 16 22:11:31.223584 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:31.223548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" event={"ID":"56d82b0a-b1ab-472c-8271-823bb67d84fd","Type":"ContainerStarted","Data":"91a513710191b7f01b31d094b308ed82df6946d45755833c30ff8ed823c48498"} Apr 16 22:11:31.223951 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:31.223647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:11:31.237343 ip-10-0-138-154 kubenswrapper[2575]: I0416 
22:11:31.237299 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" podStartSLOduration=1.159745079 podStartE2EDuration="5.237283123s" podCreationTimestamp="2026-04-16 22:11:26 +0000 UTC" firstStartedPulling="2026-04-16 22:11:26.555463557 +0000 UTC m=+392.134966262" lastFinishedPulling="2026-04-16 22:11:30.633001597 +0000 UTC m=+396.212504306" observedRunningTime="2026-04-16 22:11:31.236410769 +0000 UTC m=+396.815913496" watchObservedRunningTime="2026-04-16 22:11:31.237283123 +0000 UTC m=+396.816785852" Apr 16 22:11:37.228712 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:11:37.228676 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-zv2pk" Apr 16 22:12:01.196282 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.196246 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l"] Apr 16 22:12:01.199489 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.199469 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.201625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.201603 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:12:01.201625 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.201614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 22:12:01.202600 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.202580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 22:12:01.202733 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.202647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 22:12:01.202733 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.202665 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-d5qw4\"" Apr 16 22:12:01.202851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.202769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 22:12:01.207887 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.207868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l"] Apr 16 22:12:01.295978 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.295945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dfb99abe-c737-48ae-b4f5-e9a63e81b883-manager-config\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " 
pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.295978 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.295980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfb99abe-c737-48ae-b4f5-e9a63e81b883-cert\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.296194 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.296006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdnr\" (UniqueName: \"kubernetes.io/projected/dfb99abe-c737-48ae-b4f5-e9a63e81b883-kube-api-access-ggdnr\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.296194 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.296089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfb99abe-c737-48ae-b4f5-e9a63e81b883-metrics-cert\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.397347 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.397317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdnr\" (UniqueName: \"kubernetes.io/projected/dfb99abe-c737-48ae-b4f5-e9a63e81b883-kube-api-access-ggdnr\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.397479 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.397355 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfb99abe-c737-48ae-b4f5-e9a63e81b883-metrics-cert\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.397479 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.397421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dfb99abe-c737-48ae-b4f5-e9a63e81b883-manager-config\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.397479 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.397440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfb99abe-c737-48ae-b4f5-e9a63e81b883-cert\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.397976 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.397954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dfb99abe-c737-48ae-b4f5-e9a63e81b883-manager-config\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.400088 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.400061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfb99abe-c737-48ae-b4f5-e9a63e81b883-metrics-cert\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " 
pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.400178 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.400061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dfb99abe-c737-48ae-b4f5-e9a63e81b883-cert\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.404650 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.404630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdnr\" (UniqueName: \"kubernetes.io/projected/dfb99abe-c737-48ae-b4f5-e9a63e81b883-kube-api-access-ggdnr\") pod \"lws-controller-manager-fd7d9b88b-9qk9l\" (UID: \"dfb99abe-c737-48ae-b4f5-e9a63e81b883\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.510363 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.510339 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:01.623915 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:01.623893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l"] Apr 16 22:12:01.626118 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:12:01.626086 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb99abe_c737_48ae_b4f5_e9a63e81b883.slice/crio-b7203f61d37ea3c5043aff568cb0bc7e56a7f9910547303b3f5ca00ed9315e30 WatchSource:0}: Error finding container b7203f61d37ea3c5043aff568cb0bc7e56a7f9910547303b3f5ca00ed9315e30: Status 404 returned error can't find the container with id b7203f61d37ea3c5043aff568cb0bc7e56a7f9910547303b3f5ca00ed9315e30 Apr 16 22:12:02.316067 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:02.316036 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" event={"ID":"dfb99abe-c737-48ae-b4f5-e9a63e81b883","Type":"ContainerStarted","Data":"b7203f61d37ea3c5043aff568cb0bc7e56a7f9910547303b3f5ca00ed9315e30"} Apr 16 22:12:05.325713 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:05.325680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" event={"ID":"dfb99abe-c737-48ae-b4f5-e9a63e81b883","Type":"ContainerStarted","Data":"601d16413debeba82eb064a6ce812d602eb48eb6415f482dc129241b7be06c81"} Apr 16 22:12:05.326115 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:05.325785 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:05.341219 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:05.341164 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" podStartSLOduration=1.648712997 podStartE2EDuration="4.34114708s" podCreationTimestamp="2026-04-16 22:12:01 +0000 UTC" firstStartedPulling="2026-04-16 22:12:01.628000845 +0000 UTC m=+427.207503551" lastFinishedPulling="2026-04-16 22:12:04.320434929 +0000 UTC m=+429.899937634" observedRunningTime="2026-04-16 22:12:05.340360805 +0000 UTC m=+430.919863532" watchObservedRunningTime="2026-04-16 22:12:05.34114708 +0000 UTC m=+430.920649808" Apr 16 22:12:12.134646 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.134612 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9"] Apr 16 22:12:12.137226 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.137203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.139939 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.139912 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 22:12:12.140098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.139936 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 22:12:12.140098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.139949 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 22:12:12.140098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.139942 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-52lvq\"" Apr 16 22:12:12.140098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.139958 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 22:12:12.154229 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.154206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9"] Apr 16 22:12:12.192076 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.192043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxwgp\" (UniqueName: \"kubernetes.io/projected/b352caf5-923a-4829-b55f-d85295c912c7-kube-api-access-lxwgp\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.192205 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.192094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b352caf5-923a-4829-b55f-d85295c912c7-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.192205 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.192162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b352caf5-923a-4829-b55f-d85295c912c7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.293510 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.293479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxwgp\" (UniqueName: 
\"kubernetes.io/projected/b352caf5-923a-4829-b55f-d85295c912c7-kube-api-access-lxwgp\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.293637 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.293522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b352caf5-923a-4829-b55f-d85295c912c7-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.293637 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.293544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b352caf5-923a-4829-b55f-d85295c912c7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.295976 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.295948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b352caf5-923a-4829-b55f-d85295c912c7-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.296101 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.296035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b352caf5-923a-4829-b55f-d85295c912c7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: 
\"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.301104 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.301084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxwgp\" (UniqueName: \"kubernetes.io/projected/b352caf5-923a-4829-b55f-d85295c912c7-kube-api-access-lxwgp\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-zhwr9\" (UID: \"b352caf5-923a-4829-b55f-d85295c912c7\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.448376 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.448305 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:12.572197 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:12.572170 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9"] Apr 16 22:12:12.574850 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:12:12.574819 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb352caf5_923a_4829_b55f_d85295c912c7.slice/crio-e8180096f8ad1a2ef0b5ddc5c650fa0950ac1f4dc9ea4f883832839467fda08e WatchSource:0}: Error finding container e8180096f8ad1a2ef0b5ddc5c650fa0950ac1f4dc9ea4f883832839467fda08e: Status 404 returned error can't find the container with id e8180096f8ad1a2ef0b5ddc5c650fa0950ac1f4dc9ea4f883832839467fda08e Apr 16 22:12:13.353590 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:13.353545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" event={"ID":"b352caf5-923a-4829-b55f-d85295c912c7","Type":"ContainerStarted","Data":"e8180096f8ad1a2ef0b5ddc5c650fa0950ac1f4dc9ea4f883832839467fda08e"} Apr 16 22:12:15.361994 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:12:15.361955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" event={"ID":"b352caf5-923a-4829-b55f-d85295c912c7","Type":"ContainerStarted","Data":"cbc04e752662275dca1b72565a2282b10cb5d2a484da610d44bce82b87a521d6"} Apr 16 22:12:15.362385 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:15.362116 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:15.380697 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:15.380650 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" podStartSLOduration=1.001866865 podStartE2EDuration="3.380636455s" podCreationTimestamp="2026-04-16 22:12:12 +0000 UTC" firstStartedPulling="2026-04-16 22:12:12.576846245 +0000 UTC m=+438.156348950" lastFinishedPulling="2026-04-16 22:12:14.955615835 +0000 UTC m=+440.535118540" observedRunningTime="2026-04-16 22:12:15.379442809 +0000 UTC m=+440.958945537" watchObservedRunningTime="2026-04-16 22:12:15.380636455 +0000 UTC m=+440.960139182" Apr 16 22:12:16.330647 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:16.330617 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-9qk9l" Apr 16 22:12:26.368074 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:26.368043 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-zhwr9" Apr 16 22:12:29.174851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.174818 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk"] Apr 16 22:12:29.177313 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.177292 2575 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.180633 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.180609 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 22:12:29.180759 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.180631 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lrjtd\"" Apr 16 22:12:29.180828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.180645 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 22:12:29.180828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.180655 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:12:29.180944 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.180707 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:12:29.186507 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.186487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk"] Apr 16 22:12:29.232362 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.232328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265jf\" (UniqueName: \"kubernetes.io/projected/5bafe1f8-e2aa-417f-b252-6f62857af20d-kube-api-access-265jf\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.232493 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.232409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/5bafe1f8-e2aa-417f-b252-6f62857af20d-tmp\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.232493 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.232460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bafe1f8-e2aa-417f-b252-6f62857af20d-tls-certs\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.333814 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.333779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-265jf\" (UniqueName: \"kubernetes.io/projected/5bafe1f8-e2aa-417f-b252-6f62857af20d-kube-api-access-265jf\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.334043 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.333860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bafe1f8-e2aa-417f-b252-6f62857af20d-tmp\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.334043 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.333916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bafe1f8-e2aa-417f-b252-6f62857af20d-tls-certs\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.336237 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.336209 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bafe1f8-e2aa-417f-b252-6f62857af20d-tmp\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.336388 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.336333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bafe1f8-e2aa-417f-b252-6f62857af20d-tls-certs\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.342282 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.342253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-265jf\" (UniqueName: \"kubernetes.io/projected/5bafe1f8-e2aa-417f-b252-6f62857af20d-kube-api-access-265jf\") pod \"kube-auth-proxy-6c9f6bcb5c-v8zxk\" (UID: \"5bafe1f8-e2aa-417f-b252-6f62857af20d\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.488279 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.488254 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" Apr 16 22:12:29.609745 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:29.609717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk"] Apr 16 22:12:29.612672 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:12:29.612647 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bafe1f8_e2aa_417f_b252_6f62857af20d.slice/crio-ab0b2165fb10847fac31b1fdce75960a02da8d065e71c5b117a1289e40f15b8a WatchSource:0}: Error finding container ab0b2165fb10847fac31b1fdce75960a02da8d065e71c5b117a1289e40f15b8a: Status 404 returned error can't find the container with id ab0b2165fb10847fac31b1fdce75960a02da8d065e71c5b117a1289e40f15b8a Apr 16 22:12:30.411980 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:30.411940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" event={"ID":"5bafe1f8-e2aa-417f-b252-6f62857af20d","Type":"ContainerStarted","Data":"ab0b2165fb10847fac31b1fdce75960a02da8d065e71c5b117a1289e40f15b8a"} Apr 16 22:12:32.420270 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:32.420230 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" event={"ID":"5bafe1f8-e2aa-417f-b252-6f62857af20d","Type":"ContainerStarted","Data":"9334fe6cfc6d9b7aa95b5d474a9dc6df1985c93d70a6629a004053881aa0c117"} Apr 16 22:12:32.437856 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:12:32.437796 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-v8zxk" podStartSLOduration=0.804917259 podStartE2EDuration="3.4377806s" podCreationTimestamp="2026-04-16 22:12:29 +0000 UTC" firstStartedPulling="2026-04-16 22:12:29.61480176 +0000 UTC m=+455.194304465" lastFinishedPulling="2026-04-16 22:12:32.24766508 +0000 UTC 
m=+457.827167806" observedRunningTime="2026-04-16 22:12:32.436390069 +0000 UTC m=+458.015892796" watchObservedRunningTime="2026-04-16 22:12:32.4377806 +0000 UTC m=+458.017283346" Apr 16 22:14:08.622828 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.622757 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-sbm8z"] Apr 16 22:14:08.625206 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.625189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:08.627435 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.627412 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 22:14:08.628288 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.628267 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 22:14:08.628404 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.628305 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-7kps8\"" Apr 16 22:14:08.633900 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.633870 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-sbm8z"] Apr 16 22:14:08.640040 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.639998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99sp\" (UniqueName: \"kubernetes.io/projected/590b3d5a-e3a4-4895-92d0-0fc383e05b3b-kube-api-access-k99sp\") pod \"authorino-operator-657f44b778-sbm8z\" (UID: \"590b3d5a-e3a4-4895-92d0-0fc383e05b3b\") " pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:08.741189 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.741160 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k99sp\" (UniqueName: \"kubernetes.io/projected/590b3d5a-e3a4-4895-92d0-0fc383e05b3b-kube-api-access-k99sp\") pod \"authorino-operator-657f44b778-sbm8z\" (UID: \"590b3d5a-e3a4-4895-92d0-0fc383e05b3b\") " pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:08.748521 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.748492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k99sp\" (UniqueName: \"kubernetes.io/projected/590b3d5a-e3a4-4895-92d0-0fc383e05b3b-kube-api-access-k99sp\") pod \"authorino-operator-657f44b778-sbm8z\" (UID: \"590b3d5a-e3a4-4895-92d0-0fc383e05b3b\") " pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:08.936918 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:08.936848 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:09.072831 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:09.072807 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-sbm8z"] Apr 16 22:14:09.075252 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:14:09.075211 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590b3d5a_e3a4_4895_92d0_0fc383e05b3b.slice/crio-fa00e367e6e5c1590344abed1706621875208a997b70b07860e4f7bf1f9f2087 WatchSource:0}: Error finding container fa00e367e6e5c1590344abed1706621875208a997b70b07860e4f7bf1f9f2087: Status 404 returned error can't find the container with id fa00e367e6e5c1590344abed1706621875208a997b70b07860e4f7bf1f9f2087 Apr 16 22:14:09.738968 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:09.738930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" 
event={"ID":"590b3d5a-e3a4-4895-92d0-0fc383e05b3b","Type":"ContainerStarted","Data":"fa00e367e6e5c1590344abed1706621875208a997b70b07860e4f7bf1f9f2087"} Apr 16 22:14:10.744811 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:10.744781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" event={"ID":"590b3d5a-e3a4-4895-92d0-0fc383e05b3b","Type":"ContainerStarted","Data":"6d75b31e0881b85553656db8dd767dfe5fc2c42327f7c3ff35256a8c6d7efbda"} Apr 16 22:14:10.745223 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:10.744936 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:10.760987 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:10.760898 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" podStartSLOduration=1.328802497 podStartE2EDuration="2.760880113s" podCreationTimestamp="2026-04-16 22:14:08 +0000 UTC" firstStartedPulling="2026-04-16 22:14:09.077121713 +0000 UTC m=+554.656624419" lastFinishedPulling="2026-04-16 22:14:10.509199326 +0000 UTC m=+556.088702035" observedRunningTime="2026-04-16 22:14:10.759845691 +0000 UTC m=+556.339348419" watchObservedRunningTime="2026-04-16 22:14:10.760880113 +0000 UTC m=+556.340382840" Apr 16 22:14:21.750743 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:21.750711 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-sbm8z" Apr 16 22:14:54.926667 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:54.926631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log" Apr 16 22:14:54.928787 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:14:54.928769 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log" Apr 16 22:15:07.628098 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.628065 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xqvrb"] Apr 16 22:15:07.630779 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.630752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" Apr 16 22:15:07.632935 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.632916 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-72qbh\"" Apr 16 22:15:07.640266 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.640243 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xqvrb"] Apr 16 22:15:07.751208 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.751182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh87v\" (UniqueName: \"kubernetes.io/projected/4086a620-fcda-4f4b-bd17-91f54ff4506b-kube-api-access-lh87v\") pod \"authorino-f99f4b5cd-xqvrb\" (UID: \"4086a620-fcda-4f4b-bd17-91f54ff4506b\") " pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" Apr 16 22:15:07.802209 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.797094 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-jkcfk"] Apr 16 22:15:07.802209 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.801078 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jkcfk" Apr 16 22:15:07.805470 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.805430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-jkcfk"] Apr 16 22:15:07.852442 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.852415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnkg\" (UniqueName: \"kubernetes.io/projected/0af1bb81-4494-4da5-a648-e3f28372fec9-kube-api-access-zgnkg\") pod \"authorino-7498df8756-jkcfk\" (UID: \"0af1bb81-4494-4da5-a648-e3f28372fec9\") " pod="kuadrant-system/authorino-7498df8756-jkcfk" Apr 16 22:15:07.852564 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.852467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh87v\" (UniqueName: \"kubernetes.io/projected/4086a620-fcda-4f4b-bd17-91f54ff4506b-kube-api-access-lh87v\") pod \"authorino-f99f4b5cd-xqvrb\" (UID: \"4086a620-fcda-4f4b-bd17-91f54ff4506b\") " pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" Apr 16 22:15:07.859670 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.859644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh87v\" (UniqueName: \"kubernetes.io/projected/4086a620-fcda-4f4b-bd17-91f54ff4506b-kube-api-access-lh87v\") pod \"authorino-f99f4b5cd-xqvrb\" (UID: \"4086a620-fcda-4f4b-bd17-91f54ff4506b\") " pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" Apr 16 22:15:07.942342 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.942285 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" Apr 16 22:15:07.953306 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.953279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnkg\" (UniqueName: \"kubernetes.io/projected/0af1bb81-4494-4da5-a648-e3f28372fec9-kube-api-access-zgnkg\") pod \"authorino-7498df8756-jkcfk\" (UID: \"0af1bb81-4494-4da5-a648-e3f28372fec9\") " pod="kuadrant-system/authorino-7498df8756-jkcfk" Apr 16 22:15:07.960761 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:07.960718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnkg\" (UniqueName: \"kubernetes.io/projected/0af1bb81-4494-4da5-a648-e3f28372fec9-kube-api-access-zgnkg\") pod \"authorino-7498df8756-jkcfk\" (UID: \"0af1bb81-4494-4da5-a648-e3f28372fec9\") " pod="kuadrant-system/authorino-7498df8756-jkcfk" Apr 16 22:15:08.057589 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:08.057568 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xqvrb"] Apr 16 22:15:08.060049 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:15:08.060022 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4086a620_fcda_4f4b_bd17_91f54ff4506b.slice/crio-3153d924520a5c3ea90976d00b6eafe2c5b46c672c0bd695a5658a24d4b84193 WatchSource:0}: Error finding container 3153d924520a5c3ea90976d00b6eafe2c5b46c672c0bd695a5658a24d4b84193: Status 404 returned error can't find the container with id 3153d924520a5c3ea90976d00b6eafe2c5b46c672c0bd695a5658a24d4b84193 Apr 16 22:15:08.111853 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:08.111831 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jkcfk"
Apr 16 22:15:08.227027 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:08.226944 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-jkcfk"]
Apr 16 22:15:08.230432 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:15:08.230408 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af1bb81_4494_4da5_a648_e3f28372fec9.slice/crio-a73fb214fb6f9c9b02355918b0495b7500c8181d77ae47a1aa89174ab96322e9 WatchSource:0}: Error finding container a73fb214fb6f9c9b02355918b0495b7500c8181d77ae47a1aa89174ab96322e9: Status 404 returned error can't find the container with id a73fb214fb6f9c9b02355918b0495b7500c8181d77ae47a1aa89174ab96322e9
Apr 16 22:15:08.951538 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:08.951498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" event={"ID":"4086a620-fcda-4f4b-bd17-91f54ff4506b","Type":"ContainerStarted","Data":"3153d924520a5c3ea90976d00b6eafe2c5b46c672c0bd695a5658a24d4b84193"}
Apr 16 22:15:08.952869 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:08.952835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jkcfk" event={"ID":"0af1bb81-4494-4da5-a648-e3f28372fec9","Type":"ContainerStarted","Data":"a73fb214fb6f9c9b02355918b0495b7500c8181d77ae47a1aa89174ab96322e9"}
Apr 16 22:15:10.964502 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:10.964466 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jkcfk" event={"ID":"0af1bb81-4494-4da5-a648-e3f28372fec9","Type":"ContainerStarted","Data":"829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1"}
Apr 16 22:15:10.965668 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:10.965645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" event={"ID":"4086a620-fcda-4f4b-bd17-91f54ff4506b","Type":"ContainerStarted","Data":"d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583"}
Apr 16 22:15:10.979581 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:10.979527 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-jkcfk" podStartSLOduration=1.81447998 podStartE2EDuration="3.979512065s" podCreationTimestamp="2026-04-16 22:15:07 +0000 UTC" firstStartedPulling="2026-04-16 22:15:08.231705742 +0000 UTC m=+613.811208446" lastFinishedPulling="2026-04-16 22:15:10.396737812 +0000 UTC m=+615.976240531" observedRunningTime="2026-04-16 22:15:10.978097481 +0000 UTC m=+616.557600207" watchObservedRunningTime="2026-04-16 22:15:10.979512065 +0000 UTC m=+616.559014793"
Apr 16 22:15:10.995144 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:10.995084 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" podStartSLOduration=1.669647592 podStartE2EDuration="3.995066524s" podCreationTimestamp="2026-04-16 22:15:07 +0000 UTC" firstStartedPulling="2026-04-16 22:15:08.061719994 +0000 UTC m=+613.641222702" lastFinishedPulling="2026-04-16 22:15:10.387138915 +0000 UTC m=+615.966641634" observedRunningTime="2026-04-16 22:15:10.992226494 +0000 UTC m=+616.571729224" watchObservedRunningTime="2026-04-16 22:15:10.995066524 +0000 UTC m=+616.574569253"
Apr 16 22:15:11.022705 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:11.022676 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xqvrb"]
Apr 16 22:15:12.972750 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:12.972710 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" podUID="4086a620-fcda-4f4b-bd17-91f54ff4506b" containerName="authorino" containerID="cri-o://d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583" gracePeriod=30
Apr 16 22:15:13.209636 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.209611 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb"
Apr 16 22:15:13.300100 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.300077 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh87v\" (UniqueName: \"kubernetes.io/projected/4086a620-fcda-4f4b-bd17-91f54ff4506b-kube-api-access-lh87v\") pod \"4086a620-fcda-4f4b-bd17-91f54ff4506b\" (UID: \"4086a620-fcda-4f4b-bd17-91f54ff4506b\") "
Apr 16 22:15:13.302115 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.302094 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4086a620-fcda-4f4b-bd17-91f54ff4506b-kube-api-access-lh87v" (OuterVolumeSpecName: "kube-api-access-lh87v") pod "4086a620-fcda-4f4b-bd17-91f54ff4506b" (UID: "4086a620-fcda-4f4b-bd17-91f54ff4506b"). InnerVolumeSpecName "kube-api-access-lh87v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:15:13.401006 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.400983 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lh87v\" (UniqueName: \"kubernetes.io/projected/4086a620-fcda-4f4b-bd17-91f54ff4506b-kube-api-access-lh87v\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:15:13.977283 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.977248 2575 generic.go:358] "Generic (PLEG): container finished" podID="4086a620-fcda-4f4b-bd17-91f54ff4506b" containerID="d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583" exitCode=0
Apr 16 22:15:13.977649 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.977307 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb"
Apr 16 22:15:13.977649 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.977341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" event={"ID":"4086a620-fcda-4f4b-bd17-91f54ff4506b","Type":"ContainerDied","Data":"d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583"}
Apr 16 22:15:13.977649 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.977384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xqvrb" event={"ID":"4086a620-fcda-4f4b-bd17-91f54ff4506b","Type":"ContainerDied","Data":"3153d924520a5c3ea90976d00b6eafe2c5b46c672c0bd695a5658a24d4b84193"}
Apr 16 22:15:13.977649 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.977402 2575 scope.go:117] "RemoveContainer" containerID="d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583"
Apr 16 22:15:13.985709 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.985690 2575 scope.go:117] "RemoveContainer" containerID="d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583"
Apr 16 22:15:13.985978 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:15:13.985957 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583\": container with ID starting with d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583 not found: ID does not exist" containerID="d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583"
Apr 16 22:15:13.986083 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.985989 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583"} err="failed to get container status \"d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583\": rpc error: code = NotFound desc = could not find container \"d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583\": container with ID starting with d67366699b634875bece6eb81d5efe6d74c44f068ef7b263460873f3fb42e583 not found: ID does not exist"
Apr 16 22:15:13.995858 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.995837 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xqvrb"]
Apr 16 22:15:13.999700 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:13.999677 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xqvrb"]
Apr 16 22:15:15.012782 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:15.012751 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4086a620-fcda-4f4b-bd17-91f54ff4506b" path="/var/lib/kubelet/pods/4086a620-fcda-4f4b-bd17-91f54ff4506b/volumes"
Apr 16 22:15:37.005386 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.005354 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mkbfd"]
Apr 16 22:15:37.005826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.005725 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4086a620-fcda-4f4b-bd17-91f54ff4506b" containerName="authorino"
Apr 16 22:15:37.005826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.005737 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4086a620-fcda-4f4b-bd17-91f54ff4506b" containerName="authorino"
Apr 16 22:15:37.005826 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.005802 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4086a620-fcda-4f4b-bd17-91f54ff4506b" containerName="authorino"
Apr 16 22:15:37.007783 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.007761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:37.013759 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.013736 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mkbfd"]
Apr 16 22:15:37.097031 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.096983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp69p\" (UniqueName: \"kubernetes.io/projected/9bfb4d02-794c-48b3-ae64-941dc9c39aad-kube-api-access-hp69p\") pod \"authorino-8b475cf9f-mkbfd\" (UID: \"9bfb4d02-794c-48b3-ae64-941dc9c39aad\") " pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:37.197929 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.197896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp69p\" (UniqueName: \"kubernetes.io/projected/9bfb4d02-794c-48b3-ae64-941dc9c39aad-kube-api-access-hp69p\") pod \"authorino-8b475cf9f-mkbfd\" (UID: \"9bfb4d02-794c-48b3-ae64-941dc9c39aad\") " pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:37.205239 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.205210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp69p\" (UniqueName: \"kubernetes.io/projected/9bfb4d02-794c-48b3-ae64-941dc9c39aad-kube-api-access-hp69p\") pod \"authorino-8b475cf9f-mkbfd\" (UID: \"9bfb4d02-794c-48b3-ae64-941dc9c39aad\") " pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:37.229276 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.229249 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mkbfd"]
Apr 16 22:15:37.229458 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.229447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:37.253043 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.253005 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-dcf94799c-mgqd7"]
Apr 16 22:15:37.255882 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.255833 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:37.264921 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.264902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dcf94799c-mgqd7"]
Apr 16 22:15:37.345709 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.345683 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mkbfd"]
Apr 16 22:15:37.348635 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:15:37.348600 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bfb4d02_794c_48b3_ae64_941dc9c39aad.slice/crio-6b67f711cf6ac921e2b1486d32f464b585e35d9171d2c9d2f8f39bf48553c78a WatchSource:0}: Error finding container 6b67f711cf6ac921e2b1486d32f464b585e35d9171d2c9d2f8f39bf48553c78a: Status 404 returned error can't find the container with id 6b67f711cf6ac921e2b1486d32f464b585e35d9171d2c9d2f8f39bf48553c78a
Apr 16 22:15:37.350725 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.350702 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-dcf94799c-mgqd7"]
Apr 16 22:15:37.350951 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:15:37.350931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vj7lq], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-dcf94799c-mgqd7" podUID="05b778d2-4450-4e25-9182-0d662f9cb62d"
Apr 16 22:15:37.369131 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.369108 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-74cd789d9c-b74j5"]
Apr 16 22:15:37.371545 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.371528 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.373728 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.373710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 22:15:37.378505 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.378485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-74cd789d9c-b74j5"]
Apr 16 22:15:37.400304 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.400278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj7lq\" (UniqueName: \"kubernetes.io/projected/05b778d2-4450-4e25-9182-0d662f9cb62d-kube-api-access-vj7lq\") pod \"authorino-dcf94799c-mgqd7\" (UID: \"05b778d2-4450-4e25-9182-0d662f9cb62d\") " pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:37.501292 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.501267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj7lq\" (UniqueName: \"kubernetes.io/projected/05b778d2-4450-4e25-9182-0d662f9cb62d-kube-api-access-vj7lq\") pod \"authorino-dcf94799c-mgqd7\" (UID: \"05b778d2-4450-4e25-9182-0d662f9cb62d\") " pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:37.501391 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.501329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmn4p\" (UniqueName: \"kubernetes.io/projected/2df57625-8419-43a2-b1d9-7588911763ce-kube-api-access-fmn4p\") pod \"authorino-74cd789d9c-b74j5\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") " pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.501391 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.501373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2df57625-8419-43a2-b1d9-7588911763ce-tls-cert\") pod \"authorino-74cd789d9c-b74j5\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") " pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.508635 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.508581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj7lq\" (UniqueName: \"kubernetes.io/projected/05b778d2-4450-4e25-9182-0d662f9cb62d-kube-api-access-vj7lq\") pod \"authorino-dcf94799c-mgqd7\" (UID: \"05b778d2-4450-4e25-9182-0d662f9cb62d\") " pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:37.602730 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.602704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmn4p\" (UniqueName: \"kubernetes.io/projected/2df57625-8419-43a2-b1d9-7588911763ce-kube-api-access-fmn4p\") pod \"authorino-74cd789d9c-b74j5\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") " pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.602834 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.602737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2df57625-8419-43a2-b1d9-7588911763ce-tls-cert\") pod \"authorino-74cd789d9c-b74j5\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") " pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.604906 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.604888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2df57625-8419-43a2-b1d9-7588911763ce-tls-cert\") pod \"authorino-74cd789d9c-b74j5\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") " pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.609561 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.609545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmn4p\" (UniqueName: \"kubernetes.io/projected/2df57625-8419-43a2-b1d9-7588911763ce-kube-api-access-fmn4p\") pod \"authorino-74cd789d9c-b74j5\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") " pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.682567 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.682536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:15:37.800057 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:37.800036 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-74cd789d9c-b74j5"]
Apr 16 22:15:37.802375 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:15:37.802350 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2df57625_8419_43a2_b1d9_7588911763ce.slice/crio-154ad6e4b10a0d0f79fe11b8dbfc794d473d7a2e9c11076ddda6304bf4e45296 WatchSource:0}: Error finding container 154ad6e4b10a0d0f79fe11b8dbfc794d473d7a2e9c11076ddda6304bf4e45296: Status 404 returned error can't find the container with id 154ad6e4b10a0d0f79fe11b8dbfc794d473d7a2e9c11076ddda6304bf4e45296
Apr 16 22:15:38.064176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.064090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-74cd789d9c-b74j5" event={"ID":"2df57625-8419-43a2-b1d9-7588911763ce","Type":"ContainerStarted","Data":"154ad6e4b10a0d0f79fe11b8dbfc794d473d7a2e9c11076ddda6304bf4e45296"}
Apr 16 22:15:38.065402 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.065346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mkbfd" event={"ID":"9bfb4d02-794c-48b3-ae64-941dc9c39aad","Type":"ContainerStarted","Data":"970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553"}
Apr 16 22:15:38.065545 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.065406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mkbfd" event={"ID":"9bfb4d02-794c-48b3-ae64-941dc9c39aad","Type":"ContainerStarted","Data":"6b67f711cf6ac921e2b1486d32f464b585e35d9171d2c9d2f8f39bf48553c78a"}
Apr 16 22:15:38.065545 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.065378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:38.065545 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.065393 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-mkbfd" podUID="9bfb4d02-794c-48b3-ae64-941dc9c39aad" containerName="authorino" containerID="cri-o://970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553" gracePeriod=30
Apr 16 22:15:38.070287 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.070263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:38.080374 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.080336 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-mkbfd" podStartSLOduration=1.6821076590000001 podStartE2EDuration="2.080326125s" podCreationTimestamp="2026-04-16 22:15:36 +0000 UTC" firstStartedPulling="2026-04-16 22:15:37.349876074 +0000 UTC m=+642.929378779" lastFinishedPulling="2026-04-16 22:15:37.748094536 +0000 UTC m=+643.327597245" observedRunningTime="2026-04-16 22:15:38.078990097 +0000 UTC m=+643.658492826" watchObservedRunningTime="2026-04-16 22:15:38.080326125 +0000 UTC m=+643.659828852"
Apr 16 22:15:38.208834 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.208808 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj7lq\" (UniqueName: \"kubernetes.io/projected/05b778d2-4450-4e25-9182-0d662f9cb62d-kube-api-access-vj7lq\") pod \"05b778d2-4450-4e25-9182-0d662f9cb62d\" (UID: \"05b778d2-4450-4e25-9182-0d662f9cb62d\") "
Apr 16 22:15:38.211147 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.211099 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b778d2-4450-4e25-9182-0d662f9cb62d-kube-api-access-vj7lq" (OuterVolumeSpecName: "kube-api-access-vj7lq") pod "05b778d2-4450-4e25-9182-0d662f9cb62d" (UID: "05b778d2-4450-4e25-9182-0d662f9cb62d"). InnerVolumeSpecName "kube-api-access-vj7lq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:15:38.310056 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.309998 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vj7lq\" (UniqueName: \"kubernetes.io/projected/05b778d2-4450-4e25-9182-0d662f9cb62d-kube-api-access-vj7lq\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:15:38.324411 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.324391 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:38.410544 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.410515 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp69p\" (UniqueName: \"kubernetes.io/projected/9bfb4d02-794c-48b3-ae64-941dc9c39aad-kube-api-access-hp69p\") pod \"9bfb4d02-794c-48b3-ae64-941dc9c39aad\" (UID: \"9bfb4d02-794c-48b3-ae64-941dc9c39aad\") "
Apr 16 22:15:38.412842 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.412813 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfb4d02-794c-48b3-ae64-941dc9c39aad-kube-api-access-hp69p" (OuterVolumeSpecName: "kube-api-access-hp69p") pod "9bfb4d02-794c-48b3-ae64-941dc9c39aad" (UID: "9bfb4d02-794c-48b3-ae64-941dc9c39aad"). InnerVolumeSpecName "kube-api-access-hp69p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:15:38.511448 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:38.511425 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hp69p\" (UniqueName: \"kubernetes.io/projected/9bfb4d02-794c-48b3-ae64-941dc9c39aad-kube-api-access-hp69p\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:15:39.070093 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.070059 2575 generic.go:358] "Generic (PLEG): container finished" podID="9bfb4d02-794c-48b3-ae64-941dc9c39aad" containerID="970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553" exitCode=0
Apr 16 22:15:39.070509 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.070107 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mkbfd"
Apr 16 22:15:39.070509 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.070133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mkbfd" event={"ID":"9bfb4d02-794c-48b3-ae64-941dc9c39aad","Type":"ContainerDied","Data":"970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553"}
Apr 16 22:15:39.070509 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.070171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mkbfd" event={"ID":"9bfb4d02-794c-48b3-ae64-941dc9c39aad","Type":"ContainerDied","Data":"6b67f711cf6ac921e2b1486d32f464b585e35d9171d2c9d2f8f39bf48553c78a"}
Apr 16 22:15:39.070509 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.070188 2575 scope.go:117] "RemoveContainer" containerID="970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553"
Apr 16 22:15:39.071674 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.071659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dcf94799c-mgqd7"
Apr 16 22:15:39.071792 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.071762 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-74cd789d9c-b74j5" event={"ID":"2df57625-8419-43a2-b1d9-7588911763ce","Type":"ContainerStarted","Data":"44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d"}
Apr 16 22:15:39.078686 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.078658 2575 scope.go:117] "RemoveContainer" containerID="970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553"
Apr 16 22:15:39.078936 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:15:39.078920 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553\": container with ID starting with 970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553 not found: ID does not exist" containerID="970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553"
Apr 16 22:15:39.079001 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.078942 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553"} err="failed to get container status \"970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553\": rpc error: code = NotFound desc = could not find container \"970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553\": container with ID starting with 970c72b7d86f624f6dc1484ac6db1673c2f8515e6e22cc8932c6e1f45b7c7553 not found: ID does not exist"
Apr 16 22:15:39.086954 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.086914 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-74cd789d9c-b74j5" podStartSLOduration=1.612143221 podStartE2EDuration="2.086902224s" podCreationTimestamp="2026-04-16 22:15:37 +0000 UTC" firstStartedPulling="2026-04-16 22:15:37.804103291 +0000 UTC m=+643.383606012" lastFinishedPulling="2026-04-16 22:15:38.278862298 +0000 UTC m=+643.858365015" observedRunningTime="2026-04-16 22:15:39.085398871 +0000 UTC m=+644.664901598" watchObservedRunningTime="2026-04-16 22:15:39.086902224 +0000 UTC m=+644.666404951"
Apr 16 22:15:39.111652 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.111621 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-jkcfk"]
Apr 16 22:15:39.111852 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.111827 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-jkcfk" podUID="0af1bb81-4494-4da5-a648-e3f28372fec9" containerName="authorino" containerID="cri-o://829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1" gracePeriod=30
Apr 16 22:15:39.115691 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.115667 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-dcf94799c-mgqd7"]
Apr 16 22:15:39.123950 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.123925 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-dcf94799c-mgqd7"]
Apr 16 22:15:39.136417 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.136397 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mkbfd"]
Apr 16 22:15:39.139660 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.139639 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mkbfd"]
Apr 16 22:15:39.343862 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.343842 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jkcfk"
Apr 16 22:15:39.520851 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.520826 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnkg\" (UniqueName: \"kubernetes.io/projected/0af1bb81-4494-4da5-a648-e3f28372fec9-kube-api-access-zgnkg\") pod \"0af1bb81-4494-4da5-a648-e3f28372fec9\" (UID: \"0af1bb81-4494-4da5-a648-e3f28372fec9\") "
Apr 16 22:15:39.522845 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.522816 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af1bb81-4494-4da5-a648-e3f28372fec9-kube-api-access-zgnkg" (OuterVolumeSpecName: "kube-api-access-zgnkg") pod "0af1bb81-4494-4da5-a648-e3f28372fec9" (UID: "0af1bb81-4494-4da5-a648-e3f28372fec9"). InnerVolumeSpecName "kube-api-access-zgnkg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:15:39.621506 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:39.621450 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgnkg\" (UniqueName: \"kubernetes.io/projected/0af1bb81-4494-4da5-a648-e3f28372fec9-kube-api-access-zgnkg\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:15:40.077373 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.077342 2575 generic.go:358] "Generic (PLEG): container finished" podID="0af1bb81-4494-4da5-a648-e3f28372fec9" containerID="829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1" exitCode=0
Apr 16 22:15:40.077800 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.077392 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jkcfk"
Apr 16 22:15:40.077800 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.077425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jkcfk" event={"ID":"0af1bb81-4494-4da5-a648-e3f28372fec9","Type":"ContainerDied","Data":"829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1"}
Apr 16 22:15:40.077800 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.077457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jkcfk" event={"ID":"0af1bb81-4494-4da5-a648-e3f28372fec9","Type":"ContainerDied","Data":"a73fb214fb6f9c9b02355918b0495b7500c8181d77ae47a1aa89174ab96322e9"}
Apr 16 22:15:40.077800 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.077472 2575 scope.go:117] "RemoveContainer" containerID="829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1"
Apr 16 22:15:40.086120 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.086103 2575 scope.go:117] "RemoveContainer" containerID="829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1"
Apr 16 22:15:40.086365 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:15:40.086343 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1\": container with ID starting with 829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1 not found: ID does not exist" containerID="829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1"
Apr 16 22:15:40.086432 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.086373 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1"} err="failed to get container status \"829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1\": rpc error: code = NotFound desc = could not find container \"829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1\": container with ID starting with 829db907185da0bd3fd8784fe712b3b77c53036601da651f7622e5b2194f3af1 not found: ID does not exist"
Apr 16 22:15:40.096153 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.096132 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-jkcfk"]
Apr 16 22:15:40.102052 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:40.102032 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-jkcfk"]
Apr 16 22:15:41.012771 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:41.012738 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b778d2-4450-4e25-9182-0d662f9cb62d" path="/var/lib/kubelet/pods/05b778d2-4450-4e25-9182-0d662f9cb62d/volumes"
Apr 16 22:15:41.013081 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:41.013063 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af1bb81-4494-4da5-a648-e3f28372fec9" path="/var/lib/kubelet/pods/0af1bb81-4494-4da5-a648-e3f28372fec9/volumes"
Apr 16 22:15:41.013417 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:15:41.013400 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bfb4d02-794c-48b3-ae64-941dc9c39aad" path="/var/lib/kubelet/pods/9bfb4d02-794c-48b3-ae64-941dc9c39aad/volumes"
Apr 16 22:16:21.549389 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.549351 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"]
Apr 16 22:16:21.549962 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.549942 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bfb4d02-794c-48b3-ae64-941dc9c39aad" containerName="authorino"
Apr 16 22:16:21.550039 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.549967 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfb4d02-794c-48b3-ae64-941dc9c39aad" containerName="authorino"
Apr 16 22:16:21.550039 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.549989 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0af1bb81-4494-4da5-a648-e3f28372fec9" containerName="authorino"
Apr 16 22:16:21.550039 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.549998 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af1bb81-4494-4da5-a648-e3f28372fec9" containerName="authorino"
Apr 16 22:16:21.550193 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.550138 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bfb4d02-794c-48b3-ae64-941dc9c39aad" containerName="authorino"
Apr 16 22:16:21.550193 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.550154 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0af1bb81-4494-4da5-a648-e3f28372fec9" containerName="authorino"
Apr 16 22:16:21.553833 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.553815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.557165 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.557134 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 22:16:21.557295 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.557218 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 22:16:21.557431 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.557413 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 16 22:16:21.557576 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.557553 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-khb59\""
Apr 16 22:16:21.559274 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.559253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"]
Apr 16 22:16:21.666004 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.665975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.666176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.666065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/567b1c65-886a-4f97-ba7a-05a1ffc26e66-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.666176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.666099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.666176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.666134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st62t\" (UniqueName: \"kubernetes.io/projected/567b1c65-886a-4f97-ba7a-05a1ffc26e66-kube-api-access-st62t\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.666176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.666156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.666176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.666173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"
Apr 16 22:16:21.767624 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.767595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\"
(UniqueName: \"kubernetes.io/secret/567b1c65-886a-4f97-ba7a-05a1ffc26e66-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.767751 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.767633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.767751 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.767666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st62t\" (UniqueName: \"kubernetes.io/projected/567b1c65-886a-4f97-ba7a-05a1ffc26e66-kube-api-access-st62t\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.767751 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.767687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.767751 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.767702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " 
pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.767751 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.767739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.768176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.768152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.768176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.768174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.768347 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.768225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.769991 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.769969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/567b1c65-886a-4f97-ba7a-05a1ffc26e66-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.770228 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.770210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/567b1c65-886a-4f97-ba7a-05a1ffc26e66-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.774881 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.774860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st62t\" (UniqueName: \"kubernetes.io/projected/567b1c65-886a-4f97-ba7a-05a1ffc26e66-kube-api-access-st62t\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-xb2pm\" (UID: \"567b1c65-886a-4f97-ba7a-05a1ffc26e66\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.865628 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.865548 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:21.986768 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:21.986737 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm"] Apr 16 22:16:21.990159 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:16:21.990133 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod567b1c65_886a_4f97_ba7a_05a1ffc26e66.slice/crio-e7f2ae20dcf78ab8c05b8cba7c6f8b0f215da9b2ee5f0392d19e5cc17aeaafa5 WatchSource:0}: Error finding container e7f2ae20dcf78ab8c05b8cba7c6f8b0f215da9b2ee5f0392d19e5cc17aeaafa5: Status 404 returned error can't find the container with id e7f2ae20dcf78ab8c05b8cba7c6f8b0f215da9b2ee5f0392d19e5cc17aeaafa5 Apr 16 22:16:22.217393 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:22.217321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" event={"ID":"567b1c65-886a-4f97-ba7a-05a1ffc26e66","Type":"ContainerStarted","Data":"e7f2ae20dcf78ab8c05b8cba7c6f8b0f215da9b2ee5f0392d19e5cc17aeaafa5"} Apr 16 22:16:27.237608 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:27.237568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" event={"ID":"567b1c65-886a-4f97-ba7a-05a1ffc26e66","Type":"ContainerStarted","Data":"e4a960cc00ed534e536de3659dc7b00e04e9a985dbed0604818e70b3b1355c9d"} Apr 16 22:16:33.264073 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:33.264036 2575 generic.go:358] "Generic (PLEG): container finished" podID="567b1c65-886a-4f97-ba7a-05a1ffc26e66" containerID="e4a960cc00ed534e536de3659dc7b00e04e9a985dbed0604818e70b3b1355c9d" exitCode=0 Apr 16 22:16:33.264437 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:33.264099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" 
event={"ID":"567b1c65-886a-4f97-ba7a-05a1ffc26e66","Type":"ContainerDied","Data":"e4a960cc00ed534e536de3659dc7b00e04e9a985dbed0604818e70b3b1355c9d"} Apr 16 22:16:33.264740 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:33.264725 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:16:37.280414 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:37.280379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" event={"ID":"567b1c65-886a-4f97-ba7a-05a1ffc26e66","Type":"ContainerStarted","Data":"c1331b9b460c9cf5ab6328582647cccbb0d0ab15570647d28f85ebbe1dcd8a44"} Apr 16 22:16:37.280763 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:37.280584 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:16:37.296419 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:37.296358 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" podStartSLOduration=1.500758495 podStartE2EDuration="16.296345622s" podCreationTimestamp="2026-04-16 22:16:21 +0000 UTC" firstStartedPulling="2026-04-16 22:16:21.991821885 +0000 UTC m=+687.571324589" lastFinishedPulling="2026-04-16 22:16:36.787409008 +0000 UTC m=+702.366911716" observedRunningTime="2026-04-16 22:16:37.295462661 +0000 UTC m=+702.874965388" watchObservedRunningTime="2026-04-16 22:16:37.296345622 +0000 UTC m=+702.875848331" Apr 16 22:16:48.297202 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:16:48.297163 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-xb2pm" Apr 16 22:17:10.853309 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:10.853274 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp"] Apr 16 22:17:10.857491 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:17:10.857469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:10.859548 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:10.859527 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 16 22:17:10.863413 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:10.863387 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp"] Apr 16 22:17:11.018347 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.018317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.018496 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.018377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.018496 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.018395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z2c4\" (UniqueName: \"kubernetes.io/projected/13165dd8-ed09-4fe5-a40c-f062d66b7989-kube-api-access-6z2c4\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.018496 ip-10-0-138-154 
kubenswrapper[2575]: I0416 22:17:11.018433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13165dd8-ed09-4fe5-a40c-f062d66b7989-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.018605 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.018503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.018605 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.018534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.119963 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.119893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.119984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z2c4\" (UniqueName: \"kubernetes.io/projected/13165dd8-ed09-4fe5-a40c-f062d66b7989-kube-api-access-6z2c4\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13165dd8-ed09-4fe5-a40c-f062d66b7989-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120138 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120358 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 
22:17:11.120358 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120585 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.120683 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.120567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.122241 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.122213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/13165dd8-ed09-4fe5-a40c-f062d66b7989-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.122413 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.122396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13165dd8-ed09-4fe5-a40c-f062d66b7989-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" 
(UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.126679 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.126658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z2c4\" (UniqueName: \"kubernetes.io/projected/13165dd8-ed09-4fe5-a40c-f062d66b7989-kube-api-access-6z2c4\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-52kbp\" (UID: \"13165dd8-ed09-4fe5-a40c-f062d66b7989\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.168966 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.168938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:11.289318 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.289284 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp"] Apr 16 22:17:11.292364 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:17:11.292339 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13165dd8_ed09_4fe5_a40c_f062d66b7989.slice/crio-b3b727d4dcc8ae28d042904cd66c9788fa04fc1be5ca8d05b13f49864889934d WatchSource:0}: Error finding container b3b727d4dcc8ae28d042904cd66c9788fa04fc1be5ca8d05b13f49864889934d: Status 404 returned error can't find the container with id b3b727d4dcc8ae28d042904cd66c9788fa04fc1be5ca8d05b13f49864889934d Apr 16 22:17:11.397036 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.396933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" event={"ID":"13165dd8-ed09-4fe5-a40c-f062d66b7989","Type":"ContainerStarted","Data":"a179d4f7eb17d417d1c8c9031c9fea9b76a1d3a94b2dd09a81d774d82025b74f"} Apr 16 22:17:11.397036 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:11.396970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" event={"ID":"13165dd8-ed09-4fe5-a40c-f062d66b7989","Type":"ContainerStarted","Data":"b3b727d4dcc8ae28d042904cd66c9788fa04fc1be5ca8d05b13f49864889934d"} Apr 16 22:17:17.417953 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:17.417916 2575 generic.go:358] "Generic (PLEG): container finished" podID="13165dd8-ed09-4fe5-a40c-f062d66b7989" containerID="a179d4f7eb17d417d1c8c9031c9fea9b76a1d3a94b2dd09a81d774d82025b74f" exitCode=0 Apr 16 22:17:17.418363 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:17.417985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" event={"ID":"13165dd8-ed09-4fe5-a40c-f062d66b7989","Type":"ContainerDied","Data":"a179d4f7eb17d417d1c8c9031c9fea9b76a1d3a94b2dd09a81d774d82025b74f"} Apr 16 22:17:18.422440 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:18.422404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" event={"ID":"13165dd8-ed09-4fe5-a40c-f062d66b7989","Type":"ContainerStarted","Data":"f9c0fb80c628d658d51519eab3e6120aa953ef8696bfb5969c152c93f1b5cd5c"} Apr 16 22:17:18.422809 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:18.422621 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:18.439696 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:18.439652 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" podStartSLOduration=8.234995249 podStartE2EDuration="8.439639002s" podCreationTimestamp="2026-04-16 22:17:10 +0000 UTC" firstStartedPulling="2026-04-16 22:17:17.418604833 +0000 UTC m=+742.998107538" lastFinishedPulling="2026-04-16 22:17:17.623248581 +0000 UTC m=+743.202751291" observedRunningTime="2026-04-16 22:17:18.437589041 +0000 UTC m=+744.017091790" 
watchObservedRunningTime="2026-04-16 22:17:18.439639002 +0000 UTC m=+744.019141729" Apr 16 22:17:29.438966 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:29.438934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-52kbp" Apr 16 22:17:34.614127 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.614091 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7779b6944-b6nzj"] Apr 16 22:17:34.618375 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.618356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.623602 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.623576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7779b6944-b6nzj"] Apr 16 22:17:34.717660 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.717630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a73cc017-03f5-4ad3-989d-3d2aeaf2b676-tls-cert\") pod \"authorino-7779b6944-b6nzj\" (UID: \"a73cc017-03f5-4ad3-989d-3d2aeaf2b676\") " pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.717855 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.717699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5fp\" (UniqueName: \"kubernetes.io/projected/a73cc017-03f5-4ad3-989d-3d2aeaf2b676-kube-api-access-7x5fp\") pod \"authorino-7779b6944-b6nzj\" (UID: \"a73cc017-03f5-4ad3-989d-3d2aeaf2b676\") " pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.818251 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.818209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a73cc017-03f5-4ad3-989d-3d2aeaf2b676-tls-cert\") pod 
\"authorino-7779b6944-b6nzj\" (UID: \"a73cc017-03f5-4ad3-989d-3d2aeaf2b676\") " pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.818418 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.818321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5fp\" (UniqueName: \"kubernetes.io/projected/a73cc017-03f5-4ad3-989d-3d2aeaf2b676-kube-api-access-7x5fp\") pod \"authorino-7779b6944-b6nzj\" (UID: \"a73cc017-03f5-4ad3-989d-3d2aeaf2b676\") " pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.820670 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.820641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a73cc017-03f5-4ad3-989d-3d2aeaf2b676-tls-cert\") pod \"authorino-7779b6944-b6nzj\" (UID: \"a73cc017-03f5-4ad3-989d-3d2aeaf2b676\") " pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.825167 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.825146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5fp\" (UniqueName: \"kubernetes.io/projected/a73cc017-03f5-4ad3-989d-3d2aeaf2b676-kube-api-access-7x5fp\") pod \"authorino-7779b6944-b6nzj\" (UID: \"a73cc017-03f5-4ad3-989d-3d2aeaf2b676\") " pod="kuadrant-system/authorino-7779b6944-b6nzj" Apr 16 22:17:34.929100 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:34.929042 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7779b6944-b6nzj"
Apr 16 22:17:35.046381 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:35.046355 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7779b6944-b6nzj"]
Apr 16 22:17:35.048387 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:17:35.048357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73cc017_03f5_4ad3_989d_3d2aeaf2b676.slice/crio-0b1aa2f5185072c6c96b7ba9c2086f6f2074e54a88383c1e08d7fa7e1100d3a4 WatchSource:0}: Error finding container 0b1aa2f5185072c6c96b7ba9c2086f6f2074e54a88383c1e08d7fa7e1100d3a4: Status 404 returned error can't find the container with id 0b1aa2f5185072c6c96b7ba9c2086f6f2074e54a88383c1e08d7fa7e1100d3a4
Apr 16 22:17:35.478408 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:35.478377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7779b6944-b6nzj" event={"ID":"a73cc017-03f5-4ad3-989d-3d2aeaf2b676","Type":"ContainerStarted","Data":"0b1aa2f5185072c6c96b7ba9c2086f6f2074e54a88383c1e08d7fa7e1100d3a4"}
Apr 16 22:17:36.482773 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.482734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7779b6944-b6nzj" event={"ID":"a73cc017-03f5-4ad3-989d-3d2aeaf2b676","Type":"ContainerStarted","Data":"df154700833420d78453102a71c453cf5a8a947d0777e8b1c1aae04403a99091"}
Apr 16 22:17:36.496183 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.496135 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7779b6944-b6nzj" podStartSLOduration=2.043989633 podStartE2EDuration="2.496118742s" podCreationTimestamp="2026-04-16 22:17:34 +0000 UTC" firstStartedPulling="2026-04-16 22:17:35.049766947 +0000 UTC m=+760.629269656" lastFinishedPulling="2026-04-16 22:17:35.501896056 +0000 UTC m=+761.081398765" observedRunningTime="2026-04-16 22:17:36.4956592 +0000 UTC m=+762.075161927" watchObservedRunningTime="2026-04-16 22:17:36.496118742 +0000 UTC m=+762.075621469"
Apr 16 22:17:36.518093 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.518063 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-74cd789d9c-b74j5"]
Apr 16 22:17:36.518265 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.518245 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-74cd789d9c-b74j5" podUID="2df57625-8419-43a2-b1d9-7588911763ce" containerName="authorino" containerID="cri-o://44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d" gracePeriod=30
Apr 16 22:17:36.758580 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.758555 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:17:36.838422 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.838393 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2df57625-8419-43a2-b1d9-7588911763ce-tls-cert\") pod \"2df57625-8419-43a2-b1d9-7588911763ce\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") "
Apr 16 22:17:36.838556 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.838484 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmn4p\" (UniqueName: \"kubernetes.io/projected/2df57625-8419-43a2-b1d9-7588911763ce-kube-api-access-fmn4p\") pod \"2df57625-8419-43a2-b1d9-7588911763ce\" (UID: \"2df57625-8419-43a2-b1d9-7588911763ce\") "
Apr 16 22:17:36.840393 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.840364 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df57625-8419-43a2-b1d9-7588911763ce-kube-api-access-fmn4p" (OuterVolumeSpecName: "kube-api-access-fmn4p") pod "2df57625-8419-43a2-b1d9-7588911763ce" (UID: "2df57625-8419-43a2-b1d9-7588911763ce"). InnerVolumeSpecName "kube-api-access-fmn4p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:17:36.848069 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.848046 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df57625-8419-43a2-b1d9-7588911763ce-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "2df57625-8419-43a2-b1d9-7588911763ce" (UID: "2df57625-8419-43a2-b1d9-7588911763ce"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:36.939515 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.939490 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmn4p\" (UniqueName: \"kubernetes.io/projected/2df57625-8419-43a2-b1d9-7588911763ce-kube-api-access-fmn4p\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:17:36.939515 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:36.939514 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2df57625-8419-43a2-b1d9-7588911763ce-tls-cert\") on node \"ip-10-0-138-154.ec2.internal\" DevicePath \"\""
Apr 16 22:17:37.487137 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.487097 2575 generic.go:358] "Generic (PLEG): container finished" podID="2df57625-8419-43a2-b1d9-7588911763ce" containerID="44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d" exitCode=0
Apr 16 22:17:37.487581 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.487133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-74cd789d9c-b74j5" event={"ID":"2df57625-8419-43a2-b1d9-7588911763ce","Type":"ContainerDied","Data":"44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d"}
Apr 16 22:17:37.487581 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.487156 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74cd789d9c-b74j5"
Apr 16 22:17:37.487581 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.487170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-74cd789d9c-b74j5" event={"ID":"2df57625-8419-43a2-b1d9-7588911763ce","Type":"ContainerDied","Data":"154ad6e4b10a0d0f79fe11b8dbfc794d473d7a2e9c11076ddda6304bf4e45296"}
Apr 16 22:17:37.487581 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.487186 2575 scope.go:117] "RemoveContainer" containerID="44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d"
Apr 16 22:17:37.495600 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.495581 2575 scope.go:117] "RemoveContainer" containerID="44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d"
Apr 16 22:17:37.495862 ip-10-0-138-154 kubenswrapper[2575]: E0416 22:17:37.495843 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d\": container with ID starting with 44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d not found: ID does not exist" containerID="44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d"
Apr 16 22:17:37.495913 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.495869 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d"} err="failed to get container status \"44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d\": rpc error: code = NotFound desc = could not find container \"44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d\": container with ID starting with 44dca1db4ab5c252a7161e937484c8948ed59167e4787ae448fbf5019edfa18d not found: ID does not exist"
Apr 16 22:17:37.502670 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.502644 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-74cd789d9c-b74j5"]
Apr 16 22:17:37.506176 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:37.506157 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-74cd789d9c-b74j5"]
Apr 16 22:17:39.013644 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:17:39.013611 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df57625-8419-43a2-b1d9-7588911763ce" path="/var/lib/kubelet/pods/2df57625-8419-43a2-b1d9-7588911763ce/volumes"
Apr 16 22:19:54.962442 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:19:54.962356 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:19:54.964833 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:19:54.964812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:24:54.988932 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:24:54.988905 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:24:54.991395 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:24:54.991376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:29:55.016913 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:29:55.016881 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:29:55.019910 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:29:55.019888 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:34:55.041567 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:34:55.041536 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:34:55.047781 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:34:55.047758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:39:54.725514 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:39:54.725474 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7779b6944-b6nzj_a73cc017-03f5-4ad3-989d-3d2aeaf2b676/authorino/0.log"
Apr 16 22:39:55.067043 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:39:55.066998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:39:55.074598 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:39:55.074576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:39:59.195903 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:39:59.195869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-674f8cc5cf-zhwr9_b352caf5-923a-4829-b55f-d85295c912c7/manager/0.log"
Apr 16 22:40:00.543889 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:00.543859 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7779b6944-b6nzj_a73cc017-03f5-4ad3-989d-3d2aeaf2b676/authorino/0.log"
Apr 16 22:40:00.663020 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:00.662983 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-sbm8z_590b3d5a-e3a4-4895-92d0-0fc383e05b3b/manager/0.log"
Apr 16 22:40:01.908707 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:01.908675 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c9f6bcb5c-v8zxk_5bafe1f8-e2aa-417f-b252-6f62857af20d/kube-auth-proxy/0.log"
Apr 16 22:40:02.714611 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:02.714576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-xb2pm_567b1c65-886a-4f97-ba7a-05a1ffc26e66/storage-initializer/0.log"
Apr 16 22:40:02.721206 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:02.721182 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-xb2pm_567b1c65-886a-4f97-ba7a-05a1ffc26e66/main/0.log"
Apr 16 22:40:02.831337 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:02.831315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-52kbp_13165dd8-ed09-4fe5-a40c-f062d66b7989/storage-initializer/0.log"
Apr 16 22:40:02.838769 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:02.838747 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-52kbp_13165dd8-ed09-4fe5-a40c-f062d66b7989/main/0.log"
Apr 16 22:40:09.742395 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:09.742368 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-24qxc_273e4bd6-9bb1-4c92-b9b0-206b2e8c7fca/global-pull-secret-syncer/0.log"
Apr 16 22:40:09.901857 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:09.901813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7xr5k_7dd7ffd9-34ba-4ace-964e-e902616fd753/konnectivity-agent/0.log"
Apr 16 22:40:10.000646 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:10.000548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-154.ec2.internal_be7bc173f8b93cf725d465cb6fdf2be8/haproxy/0.log"
Apr 16 22:40:14.332416 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:14.332380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7779b6944-b6nzj_a73cc017-03f5-4ad3-989d-3d2aeaf2b676/authorino/0.log"
Apr 16 22:40:14.357722 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:14.357699 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-sbm8z_590b3d5a-e3a4-4895-92d0-0fc383e05b3b/manager/0.log"
Apr 16 22:40:15.846590 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.846516 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/alertmanager/0.log"
Apr 16 22:40:15.868617 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.868589 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/config-reloader/0.log"
Apr 16 22:40:15.900192 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.900176 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/kube-rbac-proxy-web/0.log"
Apr 16 22:40:15.922124 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.922097 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/kube-rbac-proxy/0.log"
Apr 16 22:40:15.943148 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.943127 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/kube-rbac-proxy-metric/0.log"
Apr 16 22:40:15.969939 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.969920 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/prom-label-proxy/0.log"
Apr 16 22:40:15.993144 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:15.993126 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3df593f4-d44b-4912-af6f-cc22bdce7c54/init-config-reloader/0.log"
Apr 16 22:40:16.036255 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.036229 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-xbj2b_8aa498c8-4e70-44a5-8cf3-8c5794a14bc9/cluster-monitoring-operator/0.log"
Apr 16 22:40:16.168448 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.168425 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-698654f8b9-vltwc_09dba9e2-ce5f-46b8-a09f-8fa332e68991/metrics-server/0.log"
Apr 16 22:40:16.343193 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.343170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmr6l_d7c988dc-9643-4f55-9745-2403cd54fc4a/node-exporter/0.log"
Apr 16 22:40:16.384514 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.384491 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmr6l_d7c988dc-9643-4f55-9745-2403cd54fc4a/kube-rbac-proxy/0.log"
Apr 16 22:40:16.407444 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.407387 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmr6l_d7c988dc-9643-4f55-9745-2403cd54fc4a/init-textfile/0.log"
Apr 16 22:40:16.514612 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.514588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lpjcl_576ba2b0-0acf-4938-bae7-06f509b251ae/kube-rbac-proxy-main/0.log"
Apr 16 22:40:16.534235 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.534210 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lpjcl_576ba2b0-0acf-4938-bae7-06f509b251ae/kube-rbac-proxy-self/0.log"
Apr 16 22:40:16.554909 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.554886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lpjcl_576ba2b0-0acf-4938-bae7-06f509b251ae/openshift-state-metrics/0.log"
Apr 16 22:40:16.608587 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.608561 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/prometheus/0.log"
Apr 16 22:40:16.630922 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.630895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/config-reloader/0.log"
Apr 16 22:40:16.654776 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.654749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/thanos-sidecar/0.log"
Apr 16 22:40:16.680581 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.680509 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/kube-rbac-proxy-web/0.log"
Apr 16 22:40:16.703101 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.703074 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/kube-rbac-proxy/0.log"
Apr 16 22:40:16.728028 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.727986 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/kube-rbac-proxy-thanos/0.log"
Apr 16 22:40:16.764911 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.764886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_45f95683-e1dc-42d4-8d51-f6135f368dc1/init-config-reloader/0.log"
Apr 16 22:40:16.842135 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.842112 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-4trnb_481c641d-358b-4737-befb-5b91970311c7/prometheus-operator-admission-webhook/0.log"
Apr 16 22:40:16.872394 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.872370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-577757d8d8-rdn4x_fab58a82-0471-4bb7-bc78-efbd4adc3dea/telemeter-client/0.log"
Apr 16 22:40:16.893360 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.893338 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-577757d8d8-rdn4x_fab58a82-0471-4bb7-bc78-efbd4adc3dea/reload/0.log"
Apr 16 22:40:16.914297 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.914270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-577757d8d8-rdn4x_fab58a82-0471-4bb7-bc78-efbd4adc3dea/kube-rbac-proxy/0.log"
Apr 16 22:40:16.941437 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.941365 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/thanos-query/0.log"
Apr 16 22:40:16.962640 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.962618 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy-web/0.log"
Apr 16 22:40:16.988732 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:16.988709 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy/0.log"
Apr 16 22:40:17.011456 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:17.011427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/prom-label-proxy/0.log"
Apr 16 22:40:17.032605 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:17.032558 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy-rules/0.log"
Apr 16 22:40:17.055274 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:17.055250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54579f7659-wm42w_cc1e439f-b750-4821-90f6-eeb916e4509b/kube-rbac-proxy-metrics/0.log"
Apr 16 22:40:18.418094 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.418064 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"]
Apr 16 22:40:18.418457 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.418447 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2df57625-8419-43a2-b1d9-7588911763ce" containerName="authorino"
Apr 16 22:40:18.418457 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.418457 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df57625-8419-43a2-b1d9-7588911763ce" containerName="authorino"
Apr 16 22:40:18.418541 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.418530 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2df57625-8419-43a2-b1d9-7588911763ce" containerName="authorino"
Apr 16 22:40:18.421688 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.421667 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.423915 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.423895 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wq7jn\"/\"kube-root-ca.crt\""
Apr 16 22:40:18.424670 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.424654 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wq7jn\"/\"openshift-service-ca.crt\""
Apr 16 22:40:18.424724 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.424660 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wq7jn\"/\"default-dockercfg-9fxhp\""
Apr 16 22:40:18.428357 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.428333 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"]
Apr 16 22:40:18.515751 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.515720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-sys\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.515919 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.515776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-lib-modules\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.515919 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.515820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn7p9\" (UniqueName: \"kubernetes.io/projected/7ee56669-a28b-46cf-8df8-0aed1c9396f9-kube-api-access-wn7p9\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.515919 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.515873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-podres\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.516124 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.515954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-proc\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.616961 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.616918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn7p9\" (UniqueName: \"kubernetes.io/projected/7ee56669-a28b-46cf-8df8-0aed1c9396f9-kube-api-access-wn7p9\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617132 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.616975 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-podres\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617132 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-proc\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617132 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-sys\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617132 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-lib-modules\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617328 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-podres\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617328 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-proc\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617328 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-sys\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.617328 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.617240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ee56669-a28b-46cf-8df8-0aed1c9396f9-lib-modules\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.625094 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.625072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn7p9\" (UniqueName: \"kubernetes.io/projected/7ee56669-a28b-46cf-8df8-0aed1c9396f9-kube-api-access-wn7p9\") pod \"perf-node-gather-daemonset-kkzt5\" (UID: \"7ee56669-a28b-46cf-8df8-0aed1c9396f9\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.732182 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.732119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:18.851058 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.850902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"]
Apr 16 22:40:18.853356 ip-10-0-138-154 kubenswrapper[2575]: W0416 22:40:18.853328 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7ee56669_a28b_46cf_8df8_0aed1c9396f9.slice/crio-792ce64df934378c622909142d7dbc52f3b2f3cc38d8e93af24193efede30a94 WatchSource:0}: Error finding container 792ce64df934378c622909142d7dbc52f3b2f3cc38d8e93af24193efede30a94: Status 404 returned error can't find the container with id 792ce64df934378c622909142d7dbc52f3b2f3cc38d8e93af24193efede30a94
Apr 16 22:40:18.854944 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:18.854927 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:40:19.165786 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:19.165759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-9vf4q_3672d731-f084-4334-a03d-3a333467d313/download-server/0.log"
Apr 16 22:40:19.199762 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:19.199726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5" event={"ID":"7ee56669-a28b-46cf-8df8-0aed1c9396f9","Type":"ContainerStarted","Data":"0e688e415643a10a9c6b183e0a5679e2477cfcba3019c6d3e93c992ad229f918"}
Apr 16 22:40:19.199905 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:19.199768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5" event={"ID":"7ee56669-a28b-46cf-8df8-0aed1c9396f9","Type":"ContainerStarted","Data":"792ce64df934378c622909142d7dbc52f3b2f3cc38d8e93af24193efede30a94"}
Apr 16 22:40:19.199905 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:19.199861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:19.216529 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:19.216486 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5" podStartSLOduration=1.216476265 podStartE2EDuration="1.216476265s" podCreationTimestamp="2026-04-16 22:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:40:19.215736501 +0000 UTC m=+2124.795239229" watchObservedRunningTime="2026-04-16 22:40:19.216476265 +0000 UTC m=+2124.795978992"
Apr 16 22:40:20.471245 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:20.471221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sc92n_18fc6a08-d922-4dd4-bac0-76c707d36daa/dns/0.log"
Apr 16 22:40:20.489720 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:20.489691 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sc92n_18fc6a08-d922-4dd4-bac0-76c707d36daa/kube-rbac-proxy/0.log"
Apr 16 22:40:20.591967 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:20.591938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w2p9p_4ad8b845-f3d7-4afe-a815-787bb7f69564/dns-node-resolver/0.log"
Apr 16 22:40:21.097508 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:21.097477 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ffv5n_4a1cd3c5-4d03-444e-82c3-29cdb850d6cf/node-ca/0.log"
Apr 16 22:40:22.105340 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:22.105315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c9f6bcb5c-v8zxk_5bafe1f8-e2aa-417f-b252-6f62857af20d/kube-auth-proxy/0.log"
Apr 16 22:40:22.764078 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:22.764052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fkq5z_3cb4878c-59b3-48d2-8c2e-646f1605bf4e/serve-healthcheck-canary/0.log"
Apr 16 22:40:23.356446 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:23.356412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tjq9f_912f768c-37c1-4dce-b9e2-2d8ce1263ffc/kube-rbac-proxy/0.log"
Apr 16 22:40:23.374928 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:23.374901 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tjq9f_912f768c-37c1-4dce-b9e2-2d8ce1263ffc/exporter/0.log"
Apr 16 22:40:23.393276 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:23.393256 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tjq9f_912f768c-37c1-4dce-b9e2-2d8ce1263ffc/extractor/0.log"
Apr 16 22:40:25.211990 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:25.211961 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-kkzt5"
Apr 16 22:40:25.494181 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:25.494092 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-674f8cc5cf-zhwr9_b352caf5-923a-4829-b55f-d85295c912c7/manager/0.log"
Apr 16 22:40:26.609251 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:26.609225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fd7d9b88b-9qk9l_dfb99abe-c737-48ae-b4f5-e9a63e81b883/manager/0.log"
Apr 16 22:40:31.182890 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:31.182857 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zp9rp_c5ff0589-676f-413b-9ec7-397666bad579/migrator/0.log"
Apr 16 22:40:31.201900 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:31.201870 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zp9rp_c5ff0589-676f-413b-9ec7-397666bad579/graceful-termination/0.log"
Apr 16 22:40:32.678335 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.678255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:40:32.698519 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.698496 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/egress-router-binary-copy/0.log"
Apr 16 22:40:32.719154 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.719134 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/cni-plugins/0.log"
Apr 16 22:40:32.739920 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.739904 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/bond-cni-plugin/0.log"
Apr 16 22:40:32.761260 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.761245 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/routeoverride-cni/0.log"
Apr 16 22:40:32.779641 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.779624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/whereabouts-cni-bincopy/0.log"
Apr 16 22:40:32.799019 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:32.798988 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7p5pj_f30d322b-dffa-40ed-b571-c4015d6c53dd/whereabouts-cni/0.log"
Apr 16 22:40:33.012252 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:33.012225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vj2qt_ded33a78-e95e-4a1a-97d0-f06ac24a881a/kube-multus/0.log"
Apr 16 22:40:33.138759 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:33.138710 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wqzqv_ef0e5fb7-90e1-4234-a572-2eeac57ba8d9/network-metrics-daemon/0.log"
Apr 16 22:40:33.155693 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:33.155670 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wqzqv_ef0e5fb7-90e1-4234-a572-2eeac57ba8d9/kube-rbac-proxy/0.log"
Apr 16 22:40:34.335488 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.335460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-controller/0.log"
Apr 16 22:40:34.351564 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.351543 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/0.log"
Apr 16 22:40:34.360807 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.360790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovn-acl-logging/1.log"
Apr 16 22:40:34.379600 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.379584 2575 log.go:25] "Finished
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/kube-rbac-proxy-node/0.log" Apr 16 22:40:34.402435 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.402415 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 22:40:34.420127 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.420110 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/northd/0.log" Apr 16 22:40:34.443111 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.443095 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/nbdb/0.log" Apr 16 22:40:34.470567 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.470549 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/sbdb/0.log" Apr 16 22:40:34.583844 ip-10-0-138-154 kubenswrapper[2575]: I0416 22:40:34.583820 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4tdqb_368f7f53-a095-41a5-b3f1-ce5057f3c97b/ovnkube-controller/0.log"