Apr 17 11:27:47.867091 ip-10-0-130-210 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:27:47.867102 ip-10-0-130-210 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:27:47.867110 ip-10-0-130-210 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:27:47.867342 ip-10-0-130-210 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:27:59.078462 ip-10-0-130-210 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:27:59.078477 ip-10-0-130-210 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 30271bb0e2824bad8c6a6efc14ccd694 --
Apr 17 11:30:26.342883 ip-10-0-130-210 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:30:26.820388 ip-10-0-130-210 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:30:26.820388 ip-10-0-130-210 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:30:26.820388 ip-10-0-130-210 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:30:26.820388 ip-10-0-130-210 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:30:26.820388 ip-10-0-130-210 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:30:26.821121 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.820993 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:30:26.826790 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826761 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
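
The failed boot at the top never reaches the kubelet binary: systemd's result 'resources' covers unit setup failures, here a missing EnvironmentFile= referenced by kubelet.service, and the restart cannot even be scheduled because the unit also references a crio.service that is not loaded. A minimal triage sketch on the node (the exact file paths have to come from the unit file itself, not from this log):

$ systemctl cat kubelet.service                    # shows the EnvironmentFile= lines and the crio.service reference
$ systemctl list-unit-files 'crio*' 'kubelet*'     # confirms whether crio.service is installed at all
$ journalctl -b -1 -u kubelet.service --no-pager   # full kubelet log from the failed boot

Once whatever provisioning step supplies the environment file and the CRI-O unit has completed, the next boot below starts the kubelet normally.
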
Apr 17 11:30:26.826790 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826785 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:26.826790 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826789 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:26.826790 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826793 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826796 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826800 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826803 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826806 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826809 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826812 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826814 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826817 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826819 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826822 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826825 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826828 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826830 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826833 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826835 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826837 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826840 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826843 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:26.826922 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826845 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826848 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826850 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826853 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826855 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826858 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826860 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826863 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826865 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826869 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826871 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826874 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826876 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826879 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826882 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826886 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826889 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826892 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826894 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826897 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:26.827386 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826900 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826903 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826905 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826908 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826911 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826913 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826917 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826922 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826926 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826930 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826933 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826936 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826939 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826941 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826944 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826947 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826950 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826952 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826955 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826958 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:26.827899 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826960 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826963 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826966 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826969 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826972 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826975 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826978 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826981 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826983 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826986 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826989 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826992 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826994 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.826996 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827009 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827012 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827014 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827017 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827020 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827022 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:26.828420 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827025 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827030 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827034 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
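
Every "unrecognized feature gate" line above names an OpenShift cluster-level gate (GatewayAPI, ManagedBootImages*, NewOLM*, and so on) that the upstream kubelet's feature-gate parser does not know about; feature_gate.go logs one warning per unknown key and carries on, so the block is noisy but harmless. The gates reach the kubelet through the featureGates map of the file passed as --config (the path shows up in the FLAG dump further down); a hypothetical, abridged view for illustration only:

$ cat /etc/kubernetes/kubelet.conf   # hypothetical abridged contents
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  ServiceAccountTokenNodeBinding: true   # GA upstream: accepted, with a removal warning
  KMSv1: true                            # deprecated upstream: accepted, with a warning
  GatewayAPI: true                       # OpenShift-only: logged as unrecognized, then skipped

The same gate map is evidently parsed more than once during startup, which is why near-identical dumps repeat below.
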
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827038 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827590 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827603 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827606 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827610 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827613 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827616 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827619 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827622 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827625 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827627 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827630 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827632 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827635 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827638 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827641 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827643 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:26.828923 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827646 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827649 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827651 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827654 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827656 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827659 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827662 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827666 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827677 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827680 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827682 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827685 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827688 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827690 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827692 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827695 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827698 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827701 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827704 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827707 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:26.829464 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827709 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827711 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827714 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827717 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827720 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827723 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827726 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827728 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827731 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827734 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827736 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827739 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827741 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827743 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827746 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827748 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827751 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827753 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827756 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827758 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:26.829993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827761 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827769 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827772 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827774 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827777 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827779 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827782 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827784 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827788 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827791 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827793 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827796 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827799 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827801 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827804 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827811 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827816 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827820 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827823 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:26.830490 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827826 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827829 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827832 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827834 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827837 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827840 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827842 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827845 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827847 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827850 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.827852 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
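
The FLAG: block that follows is not an error either: at --v=2 the kubelet logs every command-line flag with its effective value via flags.go:64, defaults included; it reflects only what was passed on the command line, not the merged result after the --config file is applied. To inspect the configuration the kubelet actually ended up with, the node's /configz endpoint can be queried through the API server; a sketch, assuming an oc session with sufficient RBAC (jq is optional pretty-printing):

$ oc get --raw "/api/v1/nodes/ip-10-0-130-210.ec2.internal/proxy/configz" | jq .kubeletconfig
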
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829376 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829392 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829402 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829408 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829415 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829419 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829424 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829429 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829433 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829436 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:30:26.830983 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829440 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829444 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829448 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829451 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829454 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829457 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829461 2570 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829464 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829467 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829476 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829479 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829483 2570 flags.go:64] FLAG: --config-dir=""
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829486 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829489 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829494 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829497 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829500 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829503 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829507 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829510 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829528 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829534 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829539 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829545 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829548 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:30:26.831510 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829552 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829555 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829560 2570 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829563 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829569 2570 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829572 2570 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829575 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829579 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829583 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829587 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829590 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829593 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829597 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829600 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829603 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829606 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829609 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829613 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829616 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829619 2570 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829624 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829627 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829630 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829634 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829637 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 11:30:26.832149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829640 2570 flags.go:64] FLAG: --help="false"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829643 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-130-210.ec2.internal"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829647 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829650 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829653 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829657 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829660 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829664 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829667 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829670 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829673 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829677 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829680 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829683 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829686 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829690 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829694 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829696 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829699 2570 flags.go:64] FLAG: --lock-file=""
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829702 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829705 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829709 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829714 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 11:30:26.832773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829717 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829720 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829723 2570 flags.go:64] FLAG: --logging-format="text"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829726 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829730 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829733 2570 flags.go:64] FLAG: --manifest-url=""
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829736 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829741 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829744 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829748 2570 flags.go:64] FLAG: --max-pods="110"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829751 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829754 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829758 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829761 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829764 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829767 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829770 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829781 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829785 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829788 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829792 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829795 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829801 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829804 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 11:30:26.833364 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829807 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829810 2570 flags.go:64] FLAG: --port="10250"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829813 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829817 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-025ec1a2df7b470d0"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829820 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829823 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829826 2570 flags.go:64] FLAG: --register-node="true"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829829 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829832 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829836 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829839 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829842 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829845 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829849 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829852 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829856 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829859 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829861 2570 flags.go:64] FLAG: --runonce="false"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829864 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829868 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829871 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829874 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829877 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829880 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829884 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829887 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 11:30:26.833987 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829894 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829897 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829900 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829904 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829907 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829911 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829914 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829920 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829923 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829926 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829931 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829934 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829937 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829939 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829943 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829945 2570 flags.go:64] FLAG: --v="2"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829950 2570 flags.go:64] FLAG: --version="false"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829955 2570 flags.go:64] FLAG: --vmodule=""
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829960 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.829963 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
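
The deprecation warnings at the top of this boot map directly onto FLAG values above: --container-runtime-endpoint, --volume-plugin-dir, and --system-reserved have direct equivalents in the --config file, --pod-infra-container-image is being superseded by CRI-provided sandbox-image information, and --minimum-container-ttl-duration is replaced by the eviction settings its own warning names. A hypothetical, abridged sketch of the corresponding config-file keys (field names are upstream kubelet.config.k8s.io/v1beta1; values copied from the FLAG dump):

$ cat /etc/kubernetes/kubelet.conf   # hypothetical, runtime/resources keys only
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi

Moving the flags into this file and dropping them from the unit's command line should silence the warnings without changing behavior.
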
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830103 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:30:26.834624 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830106 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830109 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830112 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830115 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830120 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830123 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830127 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830130 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830134 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830137 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830140 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830143 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830146 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830149 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830152 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830154 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830157 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830160 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830162 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:30:26.835206 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830165 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:30:26.835717 ip-10-0-130-210 
kubenswrapper[2570]: W0417 11:30:26.830167 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830170 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830173 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830175 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830178 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830180 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830183 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830185 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830188 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830191 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830194 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830197 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830199 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830202 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830205 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830207 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830210 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830214 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830218 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:26.835717 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830220 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830224 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830227 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830230 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830233 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830235 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830239 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830241 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830244 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830246 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830249 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830252 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830254 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830257 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830260 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830263 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830265 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830268 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830270 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830273 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:26.836215 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830276 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830278 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830281 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830283 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830288 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830291 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830293 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830296 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830299 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830301 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830304 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830306 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830309 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830314 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830317 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830320 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830322 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830325 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830328 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830331 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:26.836765 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830334 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:26.837267 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830336 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:26.837267 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.830339 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:26.837267 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.831025 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:30:26.839498 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.839466 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:30:26.839498 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.839493 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839571 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839578 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839581 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839585 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839588 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839591 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839593 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839596 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839599 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839601 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839604 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839606 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839609 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839612 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839615 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839618 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839622 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
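The flood of feature_gate.go:328 warnings above is the kubelet rejecting gate names it was not compiled with: OpenShift hands the node its full cluster feature-gate set, the kubelet keeps only the upstream Kubernetes gates, and each unknown name is warned about and skipped rather than treated as fatal, which is why startup proceeds to the feature_gate.go:384 summary. A minimal Go sketch of that parse-and-warn shape (hypothetical gate table and function names, standard library only, not the actual component-base implementation):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// known mirrors the gates this binary was compiled with; anything else
// is reported but ignored, so startup continues normally.
var known = map[string]bool{
	"ServiceAccountTokenNodeBinding": true,
	"KMSv1":                          true,
}

// parseGates consumes a "Name=true,Other=false" spec the way a
// feature-gate flag value is parsed: warn on unknown keys, keep the rest.
func parseGates(spec string) map[string]bool {
	out := map[string]bool{}
	for _, pair := range strings.Split(spec, ",") {
		k, v, ok := strings.Cut(pair, "=")
		if !ok {
			continue
		}
		if !known[k] {
			fmt.Printf("W unrecognized feature gate: %s\n", k)
			continue
		}
		b, err := strconv.ParseBool(v)
		if err != nil {
			continue
		}
		out[k] = b
	}
	return out
}

func main() {
	gates := parseGates("ServiceAccountTokenNodeBinding=true,GatewayAPI=true,KMSv1=true")
	// Mirrors the feature_gate.go:384 "feature gates: {map[...]}" summary:
	// only recognized gates survive into the final map.
	fmt.Printf("feature gates: %v\n", gates)
}
```

Consistent with this, every "feature gates: {map[...]}" summary below contains only upstream gates; the unrecognized OpenShift-side names never make it into the map, so these warnings are noisy but benign.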
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839627 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:26.839671 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839631 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839634 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839637 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839640 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839643 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839646 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839649 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839651 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839654 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839657 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839660 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839662 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839665 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839667 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839670 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839675 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839678 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839681 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839684 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839687 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:26.840153 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839689 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839692 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839695 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839697 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839701 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839703 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839705 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839708 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839711 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839714 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839717 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839719 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839722 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839725 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839728 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839731 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839733 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839738 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839741 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839745 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:26.840669 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839747 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839750 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839752 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839755 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839758 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839760 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839763 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839766 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839769 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839771 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839774 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839777 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839779 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839782 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839784 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839787 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839790 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839792 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839796 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:26.841173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839799 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839801 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839804 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839806 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839809 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839812 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839814 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839817 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839820 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.839826 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839942 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839946 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839949 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839952 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839955 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:26.841665 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839957 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839961 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839965 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839968 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839970 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839973 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839976 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839979 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839982 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839985 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839989 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839992 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839995 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.839998 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840001 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840004 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840006 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840009 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840012 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:26.842043 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840015 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840017 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840020 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840022 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840025 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840028 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840030 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840033 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840035 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840038 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840040 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840043 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840046 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840048 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840051 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840054 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840056 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840059 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840062 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:26.842532 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840065 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840067 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840069 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840072 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840075 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840077 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840080 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840082 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840085 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840088 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840090 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840093 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840099 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840102 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840105 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840107 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840110 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840112 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840115 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840117 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:26.843036 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840120 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840122 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840125 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840127 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840129 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840132 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840135 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840138 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840141 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840143 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840146 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840149 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840151 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840154 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840157 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840159 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840162 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840165 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840168 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840170 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:26.843591 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840173 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:26.844084 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840175 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:26.844084 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:26.840178 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:26.844084 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.840183 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:30:26.844084 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.840979 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:30:26.847200 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.847178 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:30:26.848139 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.848121 2570 server.go:1019] "Starting client certificate rotation"
Apr 17 11:30:26.848254 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.848236 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:30:26.848309 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.848278 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:30:26.877490 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.877466 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:30:26.880701 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.880672 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:30:26.899476 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.899449 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:30:26.905373 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.905343 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:30:26.906367 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.906354 2570 log.go:25] "Validated CRI v1 image API"
Apr 17 11:30:26.908221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.908191 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:30:26.912553 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.912507 2570 fs.go:135] Filesystem UUIDs: map[7157067f-910d-441a-95bd-46c4505d51b1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a1453838-5cdc-4587-890c-bb54306d5f09:/dev/nvme0n1p3]
Apr 17 11:30:26.912553 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.912549 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:30:26.917757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.917639 2570 manager.go:217] Machine: {Timestamp:2026-04-17 11:30:26.916308615 +0000 UTC m=+0.444885856 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101021 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b717ca2dc8b1b30fed682b457b673 SystemUUID:ec2b717c-a2dc-8b1b-30fe-d682b457b673 BootID:30271bb0-e282-4bad-8c6a-6efc14ccd694 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:57:51:3b:ee:1d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:57:51:3b:ee:1d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:28:cb:6f:b7:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:30:26.917757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.917749 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:30:26.917905 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.917888 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:30:26.919564 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.919507 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:30:26.919767 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.919565 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-210.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:30:26.919863 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.919786 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:30:26.919863 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.919800 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:30:26.919863 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.919820 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:30:26.919863 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.919843 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:30:26.921227 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.921210 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:30:26.921376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.921364 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:30:26.924580 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.924565 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:30:26.924646 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.924590 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:30:26.924646 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.924607 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:30:26.924646 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.924621 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:30:26.924775 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.924649 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:30:26.925815 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.925799 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:30:26.925888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.925823 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:30:26.927678 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.927653 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b59zh"
Apr 17 11:30:26.928893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.928875 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 11:30:26.931083 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.931067 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 11:30:26.932546 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932505 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 11:30:26.932546 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932538 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932548 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932557 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932565 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932572 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932577 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932583 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932591 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932598 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 11:30:26.932610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932607 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 11:30:26.932841 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.932617 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 11:30:26.933485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.933469 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 11:30:26.933485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.933485 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 11:30:26.935304 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.935273 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b59zh"
Apr 17 11:30:26.935816 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:26.935757 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 11:30:26.936146 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:26.936124 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-210.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 11:30:26.938187 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.938038 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 11:30:26.938271 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.938219 2570 server.go:1295] "Started kubelet"
Apr 17 11:30:26.938348 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.938310 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 11:30:26.938430 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.938382 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 11:30:26.938484 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.938460 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 11:30:26.939236 ip-10-0-130-210 systemd[1]: Started Kubernetes Kubelet.
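The ratelimit.go:55 line just above describes the podresources endpoint as a token bucket: 10 burst tokens, refilled at 100 per second. A small Go sketch of that shape using golang.org/x/time/rate (illustrative only; how the kubelet wires its limiter into the gRPC server may differ):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"golang.org/x/time/rate"
)

func main() {
	// qps=100, burstTokens=10, matching the values in the log line above.
	limiter := rate.NewLimiter(rate.Limit(100), 10)

	// A handler would reject (or queue) calls when the bucket is empty;
	// Allow() is the non-blocking check, Wait() the blocking one.
	allowed, rejected := 0, 0
	for i := 0; i < 200; i++ {
		if limiter.Allow() {
			allowed++
		} else {
			rejected++
		}
	}
	// Roughly the 10 burst tokens pass instantly; the rest are rejected.
	fmt.Println("allowed:", allowed, "rejected:", rejected)

	// Tokens refill at 100/s, so a brief wait makes capacity available again.
	time.Sleep(50 * time.Millisecond)
	_ = limiter.Wait(context.Background()) // blocks until a token is free
	fmt.Println("token available again")
}
```

The practical effect is simply that a client hammering the podresources list API gets throttled to ~100 requests per second after an initial burst of 10.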
Apr 17 11:30:26.939561 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.939543 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 11:30:26.941185 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.941171 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 11:30:26.945327 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.945302 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 11:30:26.945480 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.945464 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 11:30:26.946492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.946032 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 11:30:26.946492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.946056 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 11:30:26.946633 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.946504 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 11:30:26.946633 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:26.946200 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:26.946633 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.946606 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 11:30:26.946633 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.946616 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 11:30:26.947047 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947025 2570 factory.go:153] Registering CRI-O factory
Apr 17 11:30:26.947189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947055 2570 factory.go:223] Registration of the crio container factory successfully
Apr 17 11:30:26.947189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947107 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 11:30:26.947189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947117 2570 factory.go:55] Registering systemd factory
Apr 17 11:30:26.947189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947124 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 17 11:30:26.947189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947145 2570 factory.go:103] Registering Raw factory
Apr 17 11:30:26.947189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947160 2570 manager.go:1196] Started watching for new ooms in manager
Apr 17 11:30:26.947980 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.947961 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:26.948384 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.948366 2570 manager.go:319] Starting recovery of all containers
Apr 17 11:30:26.950185 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:26.949891 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 11:30:26.950370 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.950331 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-210.ec2.internal" not found
Apr 17 11:30:26.950426 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:26.950368 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-210.ec2.internal\" not found" node="ip-10-0-130-210.ec2.internal"
Apr 17 11:30:26.959279 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.959256 2570 manager.go:324] Recovery completed
Apr 17 11:30:26.961308 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:26.961279 2570 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 11:30:26.964487 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.964472 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:26.965265 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.965247 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-210.ec2.internal" not found
Apr 17 11:30:26.967157 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.967134 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:26.967236 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.967162 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:26.967236 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.967174 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:26.967775 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.967762 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 11:30:26.967825 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.967775 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 11:30:26.967825 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.967793 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:30:26.970826 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.970812 2570 policy_none.go:49] "None policy: Start"
Apr 17 11:30:26.970889 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.970830 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 11:30:26.970889 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:26.970841 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.013328 2570 manager.go:341] "Starting Device Plugin manager"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.013464 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.013480 2570 server.go:85] "Starting device plugin registration server"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.013814 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.013828 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.013924 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.014021 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.014036 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.014611 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 11:30:27.016933 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.014650 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.022047 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.022023 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-210.ec2.internal" not found
Apr 17 11:30:27.086189 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.086101 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 11:30:27.087405 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.087380 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 11:30:27.087405 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.087413 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 11:30:27.087605 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.087435 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 11:30:27.087605 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.087442 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 11:30:27.087605 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.087474 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 11:30:27.090004 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.089981 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:27.113982 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.113952 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:27.115182 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.115164 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:27.115243 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.115198 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:27.115243 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.115209 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:27.115243 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.115235 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.123385 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.123358 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.123469 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.123390 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-210.ec2.internal\": node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.142056 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.142024 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.188388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.188356 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"]
Apr 17 11:30:27.188555 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.188444 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:27.191094 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.191074 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:27.191174 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.191107 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:27.191174 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.191119 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:27.192394 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.192379 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:27.192577 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.192561 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.192621 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.192597 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:27.193424 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.193408 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:27.193544 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.193436 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:27.193544 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.193409 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:27.193544 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.193447 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:27.193544 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.193459 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:27.193544 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.193469 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:27.194618 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.194604 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.194662 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.194632 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:27.196143 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.196126 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:27.196217 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.196153 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:27.196217 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.196166 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:27.211206 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.211177 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-210.ec2.internal\" not found" node="ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.215430 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.215412 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-210.ec2.internal\" not found" node="ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.242323 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.242288 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.248670 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.248645 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eaf9e62be8566fb5d3c6a886f76dde22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal\" (UID: \"eaf9e62be8566fb5d3c6a886f76dde22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.248746 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.248678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eaf9e62be8566fb5d3c6a886f76dde22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal\" (UID: \"eaf9e62be8566fb5d3c6a886f76dde22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.248746 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.248697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/45cba5abf7cd7b00c2f9306a991cc4cf-config\") pod \"kube-apiserver-proxy-ip-10-0-130-210.ec2.internal\" (UID: \"45cba5abf7cd7b00c2f9306a991cc4cf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.343380 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.343312 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.349803 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.349775 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eaf9e62be8566fb5d3c6a886f76dde22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal\" (UID: \"eaf9e62be8566fb5d3c6a886f76dde22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.349865 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.349813 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eaf9e62be8566fb5d3c6a886f76dde22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal\" (UID: \"eaf9e62be8566fb5d3c6a886f76dde22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.349865 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.349831 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/45cba5abf7cd7b00c2f9306a991cc4cf-config\") pod \"kube-apiserver-proxy-ip-10-0-130-210.ec2.internal\" (UID: \"45cba5abf7cd7b00c2f9306a991cc4cf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.349930 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.349875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/45cba5abf7cd7b00c2f9306a991cc4cf-config\") pod \"kube-apiserver-proxy-ip-10-0-130-210.ec2.internal\" (UID: \"45cba5abf7cd7b00c2f9306a991cc4cf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.349930 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.349885 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eaf9e62be8566fb5d3c6a886f76dde22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal\" (UID: \"eaf9e62be8566fb5d3c6a886f76dde22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.349930 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.349892 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eaf9e62be8566fb5d3c6a886f76dde22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal\" (UID: \"eaf9e62be8566fb5d3c6a886f76dde22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.444249 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.444198 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.513785 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.513754 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.517830 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.517804 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:27.545293 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.545261 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.646302 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.646203 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.746672 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.746635 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.846774 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.846739 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.847649 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.847635 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:30:27.847749 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.847630 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:27.847805 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.847746 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:30:27.847805 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.847749 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:30:27.938141 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.938045 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:25:26 +0000 UTC" deadline="2027-11-04 23:56:06.520345792 +0000 UTC"
Apr 17 11:30:27.938141 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.938083 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13596h25m38.582265394s"
Apr 17 11:30:27.946239 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.946208 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 11:30:27.947319 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:27.947300 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:27.955623 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.955593 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:30:27.979640 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.979614 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-p2kw5"
Apr 17 11:30:27.987441 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:27.987403 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-p2kw5"
Apr 17 11:30:28.047768 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.047736 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:28.074616 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:28.074572 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf9e62be8566fb5d3c6a886f76dde22.slice/crio-ec573b0183d37a5a52d02c4b95212e1edd6ba58b0d5fea62c6cfd1978963ab43 WatchSource:0}: Error finding container ec573b0183d37a5a52d02c4b95212e1edd6ba58b0d5fea62c6cfd1978963ab43: Status 404 returned error can't find the container with id ec573b0183d37a5a52d02c4b95212e1edd6ba58b0d5fea62c6cfd1978963ab43
Apr 17 11:30:28.074849 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:28.074827 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45cba5abf7cd7b00c2f9306a991cc4cf.slice/crio-c8130bed40a198dd383a6bf08ab319a7ac787f6d8eeb77111d60362d02b20b49 WatchSource:0}: Error finding container c8130bed40a198dd383a6bf08ab319a7ac787f6d8eeb77111d60362d02b20b49: Status 404 returned error can't find the container with id c8130bed40a198dd383a6bf08ab319a7ac787f6d8eeb77111d60362d02b20b49
Apr 17 11:30:28.081470 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.081448 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:30:28.090900 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.090855 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal" event={"ID":"eaf9e62be8566fb5d3c6a886f76dde22","Type":"ContainerStarted","Data":"ec573b0183d37a5a52d02c4b95212e1edd6ba58b0d5fea62c6cfd1978963ab43"}
Apr 17 11:30:28.091802 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.091779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal" event={"ID":"45cba5abf7cd7b00c2f9306a991cc4cf","Type":"ContainerStarted","Data":"c8130bed40a198dd383a6bf08ab319a7ac787f6d8eeb77111d60362d02b20b49"}
Apr 17 11:30:28.148169 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.148134 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:28.248721 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.248648 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:28.349292 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.349255 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:28.450320 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.450280 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-210.ec2.internal\" not found"
Apr 17 11:30:28.484718 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.484684 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:28.546493 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.546405 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:28.557147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.557116 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:30:28.558323 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.558300 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal"
Apr 17 11:30:28.566578 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.566547 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:30:28.926685 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.926597 2570 apiserver.go:52] "Watching apiserver"
Apr 17 11:30:28.935220 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.935186 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 11:30:28.935654 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.935630 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw","openshift-dns/node-resolver-7vxqv","openshift-image-registry/node-ca-ts68p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal","openshift-multus/multus-mpm2g","openshift-network-operator/iptables-alerter-xhfhs","openshift-ovn-kubernetes/ovnkube-node-9qrz2","kube-system/konnectivity-agent-hlx77","kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal","openshift-cluster-node-tuning-operator/tuned-gltrt","openshift-multus/multus-additional-cni-plugins-q2phv","openshift-multus/network-metrics-daemon-z52nx","openshift-network-diagnostics/network-check-target-5r2k5"]
Apr 17 11:30:28.938613 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.938593 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.940787 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.940735 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.940787 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.940751 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9b2wx\""
Apr 17 11:30:28.940976 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.940735 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.940976 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.940738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:28.942908 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.942747 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ptsrr\""
Apr 17 11:30:28.942908 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.942804 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.942908 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.942869 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.942908 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.942894 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ts68p"
Apr 17 11:30:28.944673 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.944654 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:30:28.944799 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.944658 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.944799 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.944710 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5525m\""
Apr 17 11:30:28.944799 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.944716 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.945187 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.945103 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.946991 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.946951 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 11:30:28.947289 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.947266 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xhfhs"
Apr 17 11:30:28.947464 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.947427 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.947558 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.947464 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hw8ht\""
Apr 17 11:30:28.947755 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.947736 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:30:28.947906 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.947857 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.949017 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.949000 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.949314 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.949293 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.949428 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.949374 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 11:30:28.949597 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.949582 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z9q6w\""
Apr 17 11:30:28.949944 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.949923 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:28.951901 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.951880 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.951991 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.951907 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 11:30:28.952093 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.952075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 11:30:28.952146 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.952126 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.952146 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.952139 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 11:30:28.952237 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.952190 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 11:30:28.952295 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.952272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5z96q\""
Apr 17 11:30:28.952608 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.952580 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hlx77"
Apr 17 11:30:28.954639 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.954583 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 11:30:28.954639 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.954590 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 11:30:28.954781 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.954659 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j7bcj\""
Apr 17 11:30:28.954988 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.954965 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw"
Apr 17 11:30:28.956889 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.956869 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:30:28.956984 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.956964 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qzfrx\""
Apr 17 11:30:28.956984 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.956972 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:30:28.957090 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957034 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:30:28.957231 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957206 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysctl-conf\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.957296 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957246 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-sys\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.957296 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/57cb4093-bdc6-4637-9afc-7364349a96d4-hosts-file\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:28.957296 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57cb4093-bdc6-4637-9afc-7364349a96d4-tmp-dir\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:28.957431 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957315 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkd9\" (UniqueName: \"kubernetes.io/projected/57cb4093-bdc6-4637-9afc-7364349a96d4-kube-api-access-4bkd9\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:28.957431 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-netns\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957431 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrsnm\" (UniqueName: \"kubernetes.io/projected/fa896c93-030e-48d4-afe6-575b621eca31-kube-api-access-mrsnm\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957431 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957386 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysctl-d\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.957431 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-var-lib-kubelet\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-os-release\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q2phv"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957507 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa896c93-030e-48d4-afe6-575b621eca31-cni-binary-copy\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957559 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-cni-multus\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-conf-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-multus-certs\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957636 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec258cb-b4a0-425e-b582-b392c2becdfe-host-slash\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs"
Apr 17 11:30:28.957684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysconfig\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957727 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-run\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957764 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-host\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957794 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-socket-dir-parent\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-cni-bin\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957854 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec258cb-b4a0-425e-b582-b392c2becdfe-iptables-alerter-script\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-modprobe-d\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-lib-modules\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-system-cni-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.957984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fa896c93-030e-48d4-afe6-575b621eca31-multus-daemon-config\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-etc-kubernetes\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958046 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28p8d\" (UniqueName: \"kubernetes.io/projected/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-kube-api-access-28p8d\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-cnibin\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-tuned\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-tmp\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958158 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-kubelet\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvx2\" (UniqueName: \"kubernetes.io/projected/bec258cb-b4a0-425e-b582-b392c2becdfe-kube-api-access-mvvx2\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zswr7\" (UniqueName: \"kubernetes.io/projected/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-kube-api-access-zswr7\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958264 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-serviceca\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-cni-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958332 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-systemd\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-kubernetes\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:28.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-host\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p"
Apr 17 11:30:28.959029 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-k8s-cni-cncf-io\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.959029 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.958493 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-hostroot\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:28.959348 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.959329 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 11:30:28.959432 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.959358 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 11:30:28.959432 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.959403 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z79m4\""
Apr 17 11:30:28.959843 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.959827 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:28.959928 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.959911 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:28.962216 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.962198 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:28.962308 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:28.962252 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:28.988195 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.988158 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:25:27 +0000 UTC" deadline="2027-09-21 02:11:11.822584872 +0000 UTC"
Apr 17 11:30:28.988195 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:28.988190 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12518h40m42.834397645s"
Apr 17 11:30:29.048033 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.047997 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 11:30:29.059625 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-kubernetes\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-k8s-cni-cncf-io\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj88w\" (UniqueName: \"kubernetes.io/projected/343340da-6202-4b41-8b3d-4e0c0f72ecb6-kube-api-access-dj88w\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-os-release\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysctl-conf\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-kubernetes\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059783 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-k8s-cni-cncf-io\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.059800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/57cb4093-bdc6-4637-9afc-7364349a96d4-hosts-file\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/57cb4093-bdc6-4637-9afc-7364349a96d4-hosts-file\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57cb4093-bdc6-4637-9afc-7364349a96d4-tmp-dir\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysctl-conf\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrsnm\" (UniqueName: \"kubernetes.io/projected/fa896c93-030e-48d4-afe6-575b621eca31-kube-api-access-mrsnm\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.059998 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysctl-d\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060043 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-var-lib-kubelet\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-os-release\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa896c93-030e-48d4-afe6-575b621eca31-cni-binary-copy\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-cni-multus\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-conf-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060172 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-os-release\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-var-lib-kubelet\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysctl-d\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-cni-multus\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060204 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-ovnkube-config\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-conf-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060245 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57cb4093-bdc6-4637-9afc-7364349a96d4-tmp-dir\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysconfig\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-run\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-host\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-run\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-socket-dir-parent\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-sysconfig\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt"
Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-host\") pod
\"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060372 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-cni-bin\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-socket-dir-parent\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.060631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060405 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-cni-bin\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060418 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-systemd-units\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060446 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78932d63-d2fd-4c01-8666-7f65f21faaac-ovn-node-metrics-cert\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060471 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-system-cni-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060544 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-run-netns\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33c1ecda-fae8-404c-a67f-f189a105cd44-konnectivity-ca\") pod \"konnectivity-agent-hlx77\" (UID: \"33c1ecda-fae8-404c-a67f-f189a105cd44\") " pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-system-cni-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-tuned\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060659 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvx2\" (UniqueName: \"kubernetes.io/projected/bec258cb-b4a0-425e-b582-b392c2becdfe-kube-api-access-mvvx2\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-node-log\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa896c93-030e-48d4-afe6-575b621eca31-cni-binary-copy\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060731 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33c1ecda-fae8-404c-a67f-f189a105cd44-agent-certs\") pod \"konnectivity-agent-hlx77\" (UID: \"33c1ecda-fae8-404c-a67f-f189a105cd44\") " pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-registration-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060829 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjnh\" (UniqueName: \"kubernetes.io/projected/bc493885-90a9-4fcc-9331-806c0d60be7d-kube-api-access-dkjnh\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.061393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cnibin\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-serviceca\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060945 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.060984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-slash\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061040 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-system-cni-dir\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-systemd\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-systemd\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061120 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-systemd\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-host\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-hostroot\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061194 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-sys\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061210 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-host\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061228 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-hostroot\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkd9\" (UniqueName: \"kubernetes.io/projected/57cb4093-bdc6-4637-9afc-7364349a96d4-kube-api-access-4bkd9\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-sys\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.062153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-netns\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061324 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-serviceca\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-var-lib-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061366 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-netns\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-etc-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-socket-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-multus-certs\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-run-multus-certs\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec258cb-b4a0-425e-b582-b392c2becdfe-host-slash\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061603 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-cni-bin\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-env-overrides\") pod \"ovnkube-node-9qrz2\" (UID: 
\"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec258cb-b4a0-425e-b582-b392c2becdfe-host-slash\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061646 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-run-ovn-kubernetes\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec258cb-b4a0-425e-b582-b392c2becdfe-iptables-alerter-script\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-modprobe-d\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-lib-modules\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.062973 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fa896c93-030e-48d4-afe6-575b621eca31-multus-daemon-config\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061782 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-etc-kubernetes\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-device-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061826 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-shfnj\" (UniqueName: \"kubernetes.io/projected/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-kube-api-access-shfnj\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-lib-modules\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28p8d\" (UniqueName: \"kubernetes.io/projected/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-kube-api-access-28p8d\") pod \"node-ca-ts68p\" (UID: \"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-etc-kubernetes\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-modprobe-d\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.061905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-cnibin\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-tmp\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-cnibin\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062078 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-kubelet\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-host-var-lib-kubelet\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062138 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zswr7\" (UniqueName: \"kubernetes.io/projected/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-kube-api-access-zswr7\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-cni-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec258cb-b4a0-425e-b582-b392c2becdfe-iptables-alerter-script\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.063675 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062246 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-log-socket\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzwnb\" (UniqueName: \"kubernetes.io/projected/78932d63-d2fd-4c01-8666-7f65f21faaac-kube-api-access-bzwnb\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fa896c93-030e-48d4-afe6-575b621eca31-multus-daemon-config\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062316 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-sys-fs\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.064409 
ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062296 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa896c93-030e-48d4-afe6-575b621eca31-multus-cni-dir\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-kubelet\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-ovn\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062401 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-cni-netd\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.062424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-ovnkube-script-lib\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.064409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.064205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-etc-tuned\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.064909 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.064463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-tmp\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.069777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.069745 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28p8d\" (UniqueName: \"kubernetes.io/projected/3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4-kube-api-access-28p8d\") pod \"node-ca-ts68p\" (UID: 
\"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4\") " pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.069905 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.069743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkd9\" (UniqueName: \"kubernetes.io/projected/57cb4093-bdc6-4637-9afc-7364349a96d4-kube-api-access-4bkd9\") pod \"node-resolver-7vxqv\" (UID: \"57cb4093-bdc6-4637-9afc-7364349a96d4\") " pod="openshift-dns/node-resolver-7vxqv" Apr 17 11:30:29.070008 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.069986 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zswr7\" (UniqueName: \"kubernetes.io/projected/5a52dd25-28a3-4e7e-95e7-856ba22c0c17-kube-api-access-zswr7\") pod \"tuned-gltrt\" (UID: \"5a52dd25-28a3-4e7e-95e7-856ba22c0c17\") " pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.070075 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.070018 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrsnm\" (UniqueName: \"kubernetes.io/projected/fa896c93-030e-48d4-afe6-575b621eca31-kube-api-access-mrsnm\") pod \"multus-mpm2g\" (UID: \"fa896c93-030e-48d4-afe6-575b621eca31\") " pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.071089 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.071071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvx2\" (UniqueName: \"kubernetes.io/projected/bec258cb-b4a0-425e-b582-b392c2becdfe-kube-api-access-mvvx2\") pod \"iptables-alerter-xhfhs\" (UID: \"bec258cb-b4a0-425e-b582-b392c2becdfe\") " pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.131171 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.131131 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:30:29.163284 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-var-lib-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-etc-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-socket-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163370 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-cni-bin\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-env-overrides\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-run-ovn-kubernetes\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-etc-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-device-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shfnj\" (UniqueName: \"kubernetes.io/projected/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-kube-api-access-shfnj\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163539 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-log-socket\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-var-lib-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzwnb\" (UniqueName: \"kubernetes.io/projected/78932d63-d2fd-4c01-8666-7f65f21faaac-kube-api-access-bzwnb\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-sys-fs\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-socket-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-run-ovn-kubernetes\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-device-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-log-socket\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.163893 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163500 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-cni-bin\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163919 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163967 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-env-overrides\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.163973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-kubelet\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-sys-fs\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-ovn\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164025 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-kubelet\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-ovn\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-cni-netd\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164080 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-ovnkube-script-lib\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164122 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj88w\" (UniqueName: \"kubernetes.io/projected/343340da-6202-4b41-8b3d-4e0c0f72ecb6-kube-api-access-dj88w\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-cni-netd\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-os-release\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164207 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-os-release\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 
11:30:29.164251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.164492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-ovnkube-config\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-systemd-units\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164370 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78932d63-d2fd-4c01-8666-7f65f21faaac-ovn-node-metrics-cert\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-openvswitch\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164393 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.164466 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164500 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-run-netns\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-systemd-units\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33c1ecda-fae8-404c-a67f-f189a105cd44-konnectivity-ca\") pod \"konnectivity-agent-hlx77\" (UID: \"33c1ecda-fae8-404c-a67f-f189a105cd44\") " pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-run-netns\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164622 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-ovnkube-script-lib\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.164627 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:29.664605446 +0000 UTC m=+3.193182692 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164848 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-node-log\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33c1ecda-fae8-404c-a67f-f189a105cd44-agent-certs\") pod \"konnectivity-agent-hlx77\" (UID: \"33c1ecda-fae8-404c-a67f-f189a105cd44\") " pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164887 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-node-log\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165273 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-registration-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkjnh\" (UniqueName: \"kubernetes.io/projected/bc493885-90a9-4fcc-9331-806c0d60be7d-kube-api-access-dkjnh\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cnibin\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.164976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78932d63-d2fd-4c01-8666-7f65f21faaac-ovnkube-config\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-slash\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165028 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-system-cni-dir\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc493885-90a9-4fcc-9331-806c0d60be7d-registration-dir\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33c1ecda-fae8-404c-a67f-f189a105cd44-konnectivity-ca\") pod \"konnectivity-agent-hlx77\" (UID: \"33c1ecda-fae8-404c-a67f-f189a105cd44\") " pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-slash\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-systemd\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-system-cni-dir\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cnibin\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165268 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78932d63-d2fd-4c01-8666-7f65f21faaac-run-systemd\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.165793 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.165409 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.167099 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.167078 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33c1ecda-fae8-404c-a67f-f189a105cd44-agent-certs\") pod \"konnectivity-agent-hlx77\" (UID: \"33c1ecda-fae8-404c-a67f-f189a105cd44\") " pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.167396 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.167380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78932d63-d2fd-4c01-8666-7f65f21faaac-ovn-node-metrics-cert\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.170787 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.170756 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:29.170903 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.170792 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:29.170903 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.170812 2570 projected.go:194] Error preparing data for projected volume kube-api-access-q8kv4 for pod openshift-network-diagnostics/network-check-target-5r2k5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:29.170903 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.170887 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4 podName:d764f3ad-e076-4b99-8a6f-716b6d83c925 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:30:29.670868751 +0000 UTC m=+3.199445981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-q8kv4" (UniqueName: "kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4") pod "network-check-target-5r2k5" (UID: "d764f3ad-e076-4b99-8a6f-716b6d83c925") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:29.173309 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.173274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj88w\" (UniqueName: \"kubernetes.io/projected/343340da-6202-4b41-8b3d-4e0c0f72ecb6-kube-api-access-dj88w\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:29.173412 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.173275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfnj\" (UniqueName: \"kubernetes.io/projected/652897df-2286-4fbc-9cf6-a7ce5de5d8a3-kube-api-access-shfnj\") pod \"multus-additional-cni-plugins-q2phv\" (UID: \"652897df-2286-4fbc-9cf6-a7ce5de5d8a3\") " pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.173677 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.173661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkjnh\" (UniqueName: \"kubernetes.io/projected/bc493885-90a9-4fcc-9331-806c0d60be7d-kube-api-access-dkjnh\") pod \"aws-ebs-csi-driver-node-drxgw\" (UID: \"bc493885-90a9-4fcc-9331-806c0d60be7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.174345 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.174306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzwnb\" (UniqueName: \"kubernetes.io/projected/78932d63-d2fd-4c01-8666-7f65f21faaac-kube-api-access-bzwnb\") pod \"ovnkube-node-9qrz2\" (UID: \"78932d63-d2fd-4c01-8666-7f65f21faaac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.249076 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.248987 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:30:29.250661 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.250625 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gltrt" Apr 17 11:30:29.263503 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.263353 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7vxqv" Apr 17 11:30:29.272255 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.272219 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ts68p" Apr 17 11:30:29.277904 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.277878 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mpm2g" Apr 17 11:30:29.285577 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.285539 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xhfhs" Apr 17 11:30:29.293332 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.293301 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:30:29.300052 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.300020 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hlx77" Apr 17 11:30:29.306877 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.306842 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" Apr 17 11:30:29.312606 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.312574 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q2phv" Apr 17 11:30:29.668262 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.668167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:29.668422 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.668335 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:29.668473 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.668425 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:30.668407393 +0000 UTC m=+4.196984623 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:29.753476 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.753243 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa896c93_030e_48d4_afe6_575b621eca31.slice/crio-580628bc01adb70f3144de9efe26ace7ebe247d7301c3f6da7b5cde26920cacc WatchSource:0}: Error finding container 580628bc01adb70f3144de9efe26ace7ebe247d7301c3f6da7b5cde26920cacc: Status 404 returned error can't find the container with id 580628bc01adb70f3144de9efe26ace7ebe247d7301c3f6da7b5cde26920cacc Apr 17 11:30:29.755227 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.755197 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78932d63_d2fd_4c01_8666_7f65f21faaac.slice/crio-dfda2a9baa5c5ef14c5cdf3f6a19c240288f79075441537b6419669738782185 WatchSource:0}: Error finding container dfda2a9baa5c5ef14c5cdf3f6a19c240288f79075441537b6419669738782185: Status 404 returned error can't find the container with id dfda2a9baa5c5ef14c5cdf3f6a19c240288f79075441537b6419669738782185 Apr 17 11:30:29.756834 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.756740 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57cb4093_bdc6_4637_9afc_7364349a96d4.slice/crio-b5bb5129fd9a638e3db73f4aec22ef061938e44ed1e5b4de8c38a0307cb29314 WatchSource:0}: Error finding container b5bb5129fd9a638e3db73f4aec22ef061938e44ed1e5b4de8c38a0307cb29314: Status 404 returned error can't find the container with id b5bb5129fd9a638e3db73f4aec22ef061938e44ed1e5b4de8c38a0307cb29314 Apr 17 11:30:29.758263 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.758236 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33c1ecda_fae8_404c_a67f_f189a105cd44.slice/crio-56e395a95965f4467d15cea3a8b5a94f3de679cf9ce807f3b3b9a56aa92489d4 WatchSource:0}: Error finding container 56e395a95965f4467d15cea3a8b5a94f3de679cf9ce807f3b3b9a56aa92489d4: Status 404 returned error can't find the container with id 56e395a95965f4467d15cea3a8b5a94f3de679cf9ce807f3b3b9a56aa92489d4 Apr 17 11:30:29.758972 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.758614 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652897df_2286_4fbc_9cf6_a7ce5de5d8a3.slice/crio-917ae27222bc00be27ff3b84802312d85f024d4639f993b5e91a4ed943a2f6a9 WatchSource:0}: Error finding container 917ae27222bc00be27ff3b84802312d85f024d4639f993b5e91a4ed943a2f6a9: Status 404 returned error can't find the container with id 917ae27222bc00be27ff3b84802312d85f024d4639f993b5e91a4ed943a2f6a9 Apr 17 11:30:29.760763 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.760746 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc493885_90a9_4fcc_9331_806c0d60be7d.slice/crio-2eca7b6b8ba879af11f9027ac02dc88e1815e4c9d71a0c5f67379b1f441def50 WatchSource:0}: Error finding container 2eca7b6b8ba879af11f9027ac02dc88e1815e4c9d71a0c5f67379b1f441def50: Status 404 returned error can't 
find the container with id 2eca7b6b8ba879af11f9027ac02dc88e1815e4c9d71a0c5f67379b1f441def50 Apr 17 11:30:29.762189 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.761892 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3f50d1_8ddd_40ce_9dee_cc684dedf9d4.slice/crio-cd6ad535974777c06ad82e35587aaa527b6bbd46bd8d685b43c28c9f353928db WatchSource:0}: Error finding container cd6ad535974777c06ad82e35587aaa527b6bbd46bd8d685b43c28c9f353928db: Status 404 returned error can't find the container with id cd6ad535974777c06ad82e35587aaa527b6bbd46bd8d685b43c28c9f353928db Apr 17 11:30:29.763993 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.763946 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec258cb_b4a0_425e_b582_b392c2becdfe.slice/crio-3c17e883c5d17f84bcf9fb1ea9c1f39ba144ffe386cddd9136ff15c3365435e1 WatchSource:0}: Error finding container 3c17e883c5d17f84bcf9fb1ea9c1f39ba144ffe386cddd9136ff15c3365435e1: Status 404 returned error can't find the container with id 3c17e883c5d17f84bcf9fb1ea9c1f39ba144ffe386cddd9136ff15c3365435e1 Apr 17 11:30:29.766001 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:30:29.765978 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a52dd25_28a3_4e7e_95e7_856ba22c0c17.slice/crio-381911abb1f2c54015cfabc72b07b8fcb4597ec8be27df157ec47aa8b336dafe WatchSource:0}: Error finding container 381911abb1f2c54015cfabc72b07b8fcb4597ec8be27df157ec47aa8b336dafe: Status 404 returned error can't find the container with id 381911abb1f2c54015cfabc72b07b8fcb4597ec8be27df157ec47aa8b336dafe Apr 17 11:30:29.768480 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.768455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:29.768659 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.768632 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:29.768716 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.768665 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:29.768716 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.768677 2570 projected.go:194] Error preparing data for projected volume kube-api-access-q8kv4 for pod openshift-network-diagnostics/network-check-target-5r2k5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:29.768819 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:29.768741 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4 podName:d764f3ad-e076-4b99-8a6f-716b6d83c925 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:30.768718881 +0000 UTC m=+4.297296122 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q8kv4" (UniqueName: "kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4") pod "network-check-target-5r2k5" (UID: "d764f3ad-e076-4b99-8a6f-716b6d83c925") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:29.989060 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.988944 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:25:27 +0000 UTC" deadline="2027-10-14 01:45:27.20897801 +0000 UTC" Apr 17 11:30:29.989060 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:29.988979 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13070h14m57.220001084s" Apr 17 11:30:30.088296 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.088266 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:30.088454 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.088419 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6" Apr 17 11:30:30.096153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.096105 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xhfhs" event={"ID":"bec258cb-b4a0-425e-b582-b392c2becdfe","Type":"ContainerStarted","Data":"3c17e883c5d17f84bcf9fb1ea9c1f39ba144ffe386cddd9136ff15c3365435e1"} Apr 17 11:30:30.099146 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.099114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ts68p" event={"ID":"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4","Type":"ContainerStarted","Data":"cd6ad535974777c06ad82e35587aaa527b6bbd46bd8d685b43c28c9f353928db"} Apr 17 11:30:30.100200 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.100166 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7vxqv" event={"ID":"57cb4093-bdc6-4637-9afc-7364349a96d4","Type":"ContainerStarted","Data":"b5bb5129fd9a638e3db73f4aec22ef061938e44ed1e5b4de8c38a0307cb29314"} Apr 17 11:30:30.101754 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.101730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal" event={"ID":"45cba5abf7cd7b00c2f9306a991cc4cf","Type":"ContainerStarted","Data":"d503162bf22087bbde88a62a119f29dea48b55f25593b5a95c31946b7e37dcca"} Apr 17 11:30:30.102726 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.102697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gltrt" event={"ID":"5a52dd25-28a3-4e7e-95e7-856ba22c0c17","Type":"ContainerStarted","Data":"381911abb1f2c54015cfabc72b07b8fcb4597ec8be27df157ec47aa8b336dafe"} Apr 17 11:30:30.103648 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.103623 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" 
event={"ID":"bc493885-90a9-4fcc-9331-806c0d60be7d","Type":"ContainerStarted","Data":"2eca7b6b8ba879af11f9027ac02dc88e1815e4c9d71a0c5f67379b1f441def50"} Apr 17 11:30:30.104761 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.104721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerStarted","Data":"917ae27222bc00be27ff3b84802312d85f024d4639f993b5e91a4ed943a2f6a9"} Apr 17 11:30:30.105725 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.105705 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hlx77" event={"ID":"33c1ecda-fae8-404c-a67f-f189a105cd44","Type":"ContainerStarted","Data":"56e395a95965f4467d15cea3a8b5a94f3de679cf9ce807f3b3b9a56aa92489d4"} Apr 17 11:30:30.107998 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.107967 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"dfda2a9baa5c5ef14c5cdf3f6a19c240288f79075441537b6419669738782185"} Apr 17 11:30:30.109087 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.109068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mpm2g" event={"ID":"fa896c93-030e-48d4-afe6-575b621eca31","Type":"ContainerStarted","Data":"580628bc01adb70f3144de9efe26ace7ebe247d7301c3f6da7b5cde26920cacc"} Apr 17 11:30:30.115620 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.115565 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-210.ec2.internal" podStartSLOduration=2.115544531 podStartE2EDuration="2.115544531s" podCreationTimestamp="2026-04-17 11:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:30:30.114987177 +0000 UTC m=+3.643564422" watchObservedRunningTime="2026-04-17 11:30:30.115544531 +0000 UTC m=+3.644121782" Apr 17 11:30:30.676804 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.676746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:30.676979 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.676910 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:30.677058 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.676980 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:32.67696033 +0000 UTC m=+6.205537581 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:30.779267 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:30.777350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:30.779267 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.777569 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:30.779267 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.777592 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:30.779267 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.777605 2570 projected.go:194] Error preparing data for projected volume kube-api-access-q8kv4 for pod openshift-network-diagnostics/network-check-target-5r2k5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:30.779267 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:30.777668 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4 podName:d764f3ad-e076-4b99-8a6f-716b6d83c925 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:32.7776484 +0000 UTC m=+6.306225643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8kv4" (UniqueName: "kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4") pod "network-check-target-5r2k5" (UID: "d764f3ad-e076-4b99-8a6f-716b6d83c925") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:31.091375 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:31.091281 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:31.091845 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:31.091418 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925" Apr 17 11:30:31.126186 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:31.126136 2570 generic.go:358] "Generic (PLEG): container finished" podID="eaf9e62be8566fb5d3c6a886f76dde22" containerID="4d2d4c4bcd579ed6bcdf55437d153a9e65f9127bcbbb172767f071e191246d98" exitCode=0 Apr 17 11:30:31.126376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:31.126303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal" event={"ID":"eaf9e62be8566fb5d3c6a886f76dde22","Type":"ContainerDied","Data":"4d2d4c4bcd579ed6bcdf55437d153a9e65f9127bcbbb172767f071e191246d98"} Apr 17 11:30:32.088353 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:32.088316 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:32.088570 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.088457 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6" Apr 17 11:30:32.141271 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:32.141206 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal" event={"ID":"eaf9e62be8566fb5d3c6a886f76dde22","Type":"ContainerStarted","Data":"6d4c4a3540221b9637c1c995541494628030598c17c51f12a3043b0bbeca8e7e"} Apr 17 11:30:32.693179 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:32.693137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:32.693376 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.693339 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:32.693451 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.693404 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:36.693386401 +0000 UTC m=+10.221963643 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:32.794240 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:32.794197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:32.794405 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.794371 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:32.794405 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.794394 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:32.794405 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.794406 2570 projected.go:194] Error preparing data for projected volume kube-api-access-q8kv4 for pod openshift-network-diagnostics/network-check-target-5r2k5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:32.794603 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:32.794467 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4 podName:d764f3ad-e076-4b99-8a6f-716b6d83c925 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:36.794448491 +0000 UTC m=+10.323025736 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8kv4" (UniqueName: "kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4") pod "network-check-target-5r2k5" (UID: "d764f3ad-e076-4b99-8a6f-716b6d83c925") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:33.090849 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:33.090765 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:33.091028 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:33.090895 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925" Apr 17 11:30:34.088568 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:34.088534 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:34.089007 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:34.088691 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6" Apr 17 11:30:35.088320 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:35.088291 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:35.088535 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:35.088402 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925" Apr 17 11:30:36.088110 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:36.088075 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:36.088571 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.088239 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6" Apr 17 11:30:36.726626 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:36.726467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:30:36.726812 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.726649 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:36.726812 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.726719 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:44.726698248 +0000 UTC m=+18.255275505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:36.827255 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:36.827104 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:36.827420 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.827304 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:36.827420 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.827330 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:36.827420 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.827346 2570 projected.go:194] Error preparing data for projected volume kube-api-access-q8kv4 for pod openshift-network-diagnostics/network-check-target-5r2k5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:36.827420 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:36.827412 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4 podName:d764f3ad-e076-4b99-8a6f-716b6d83c925 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:44.827392179 +0000 UTC m=+18.355969420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8kv4" (UniqueName: "kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4") pod "network-check-target-5r2k5" (UID: "d764f3ad-e076-4b99-8a6f-716b6d83c925") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:37.089722 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:37.089280 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:30:37.089722 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:37.089379 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925" Apr 17 11:30:38.087888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:38.087848 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:38.088069 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:38.088014 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:39.088017 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:39.087979 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:39.088488 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:39.088117 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:40.087896 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:40.087859 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:40.088086 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:40.087998 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:41.088225 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:41.088185 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:41.088695 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:41.088318 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:42.088012 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:42.087976 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:42.088166 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:42.088110 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:43.088250 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:43.088210 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:43.088666 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:43.088349 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:44.088679 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:44.088641 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:44.089217 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.088776 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:44.781049 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:44.781007 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:44.781225 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.781169 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:44.781299 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.781257 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:00.78123415 +0000 UTC m=+34.309811380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:44.882058 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:44.882022 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:44.882239 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.882219 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:30:44.882294 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.882246 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:30:44.882294 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.882261 2570 projected.go:194] Error preparing data for projected volume kube-api-access-q8kv4 for pod openshift-network-diagnostics/network-check-target-5r2k5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:44.882374 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:44.882326 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4 podName:d764f3ad-e076-4b99-8a6f-716b6d83c925 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:00.88230717 +0000 UTC m=+34.410884419 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8kv4" (UniqueName: "kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4") pod "network-check-target-5r2k5" (UID: "d764f3ad-e076-4b99-8a6f-716b6d83c925") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:45.088284 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:45.088198 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:45.088423 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:45.088343 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:46.088036 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:46.088001 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:46.088430 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:46.088122 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:47.089369 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:47.088882 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:47.089369 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:47.088991 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:48.088917 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.088648 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:48.089076 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:48.088976 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:48.157996 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.157965 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 11:30:48.172749 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.172712 2570 generic.go:358] "Generic (PLEG): container finished" podID="652897df-2286-4fbc-9cf6-a7ce5de5d8a3" containerID="aedc4a7a469f151829a14657fca7e09288363318c643f4d73d8abde7cd2251fb" exitCode=0
Apr 17 11:30:48.172900 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.172802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerDied","Data":"aedc4a7a469f151829a14657fca7e09288363318c643f4d73d8abde7cd2251fb"}
Apr 17 11:30:48.174117 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.174091 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hlx77" event={"ID":"33c1ecda-fae8-404c-a67f-f189a105cd44","Type":"ContainerStarted","Data":"228c1e60e44580c0464a695efa65c52bf922eade8c21da9e1b7360bf6d99820e"}
Apr 17 11:30:48.176864 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.176840 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"fb9691b2440a50d506637bdbe9d937601b4712bd5450a296f8be115fe236fe36"}
Apr 17 11:30:48.176944 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.176873 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"eac84b71ff30bd3a574b387de4c7f9e430677b73cc1eafd65768986effa81be3"}
Apr 17 11:30:48.176944 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.176889 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"73443c59c0f802334e07b2ce205e9fc7d1d7686154e4a7e7979769d61e42df29"}
Apr 17 11:30:48.176944 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.176903 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"24e0911559caf3099084815193d166db842739677c318871ad0b1ceb8fc2a8e9"}
Apr 17 11:30:48.176944 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.176915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"e762d762b613630d35d190bfec22c45ae03ceb43daa4bcae025f17b5e9de1476"}
Apr 17 11:30:48.176944 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.176927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"0b4adf9b18e014d5b50834440a6f4809d37a85edba5d8b30220e70d5c39698a6"}
Apr 17 11:30:48.179618 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.179592 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mpm2g" event={"ID":"fa896c93-030e-48d4-afe6-575b621eca31","Type":"ContainerStarted","Data":"f4e643ba60c8e2cabeb180287a1efed8e6d957aff75d4ea8457d04374daae28d"}
Apr 17 11:30:48.180846 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.180823 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ts68p" event={"ID":"3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4","Type":"ContainerStarted","Data":"f08d2c294db43a9e66197155af44738321de77b48bdf3e19ff354483ecbc9b10"}
Apr 17 11:30:48.181969 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.181941 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7vxqv" event={"ID":"57cb4093-bdc6-4637-9afc-7364349a96d4","Type":"ContainerStarted","Data":"aff1da322620efa495aa143c1ad5b5246e3821aee16371bc6f11909344a4a3de"}
Apr 17 11:30:48.183009 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.182981 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gltrt" event={"ID":"5a52dd25-28a3-4e7e-95e7-856ba22c0c17","Type":"ContainerStarted","Data":"93b723250872f154b805b3e77850f49eb557bbc3d29cf9f2e98100400e89b753"}
Apr 17 11:30:48.184475 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.184455 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" event={"ID":"bc493885-90a9-4fcc-9331-806c0d60be7d","Type":"ContainerStarted","Data":"883482cca13e08978ce700351227dd4498e611b4449aa1f40bc5e4f9a6d171a2"}
Apr 17 11:30:48.184570 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.184480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" event={"ID":"bc493885-90a9-4fcc-9331-806c0d60be7d","Type":"ContainerStarted","Data":"f00520ea2bfaf483ea2b1842ebf1d70e5ac876045d3ade22dc5ec05a81272a1e"}
Apr 17 11:30:48.194684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.194638 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-210.ec2.internal" podStartSLOduration=20.194624301 podStartE2EDuration="20.194624301s" podCreationTimestamp="2026-04-17 11:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:30:32.156943226 +0000 UTC m=+5.685520476" watchObservedRunningTime="2026-04-17 11:30:48.194624301 +0000 UTC m=+21.723201549"
Apr 17 11:30:48.209421 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.209366 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gltrt" podStartSLOduration=4.12074291 podStartE2EDuration="21.209353496s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.767937033 +0000 UTC m=+3.296514276" lastFinishedPulling="2026-04-17 11:30:46.856547618 +0000 UTC m=+20.385124862" observedRunningTime="2026-04-17 11:30:48.209045139 +0000 UTC m=+21.737622409" watchObservedRunningTime="2026-04-17 11:30:48.209353496 +0000 UTC m=+21.737930745"
Apr 17 11:30:48.223997 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.223952 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7vxqv" podStartSLOduration=3.944012655 podStartE2EDuration="21.223938746s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.759214845 +0000 UTC m=+3.287792086" lastFinishedPulling="2026-04-17 11:30:47.039140937 +0000 UTC m=+20.567718177" observedRunningTime="2026-04-17 11:30:48.223706806 +0000 UTC m=+21.752284055" watchObservedRunningTime="2026-04-17 11:30:48.223938746 +0000 UTC m=+21.752515994"
Apr 17 11:30:48.237820 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.237772 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hlx77" podStartSLOduration=4.12044917 podStartE2EDuration="21.23775697s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.760113611 +0000 UTC m=+3.288690841" lastFinishedPulling="2026-04-17 11:30:46.877421413 +0000 UTC m=+20.405998641" observedRunningTime="2026-04-17 11:30:48.237377461 +0000 UTC m=+21.765954709" watchObservedRunningTime="2026-04-17 11:30:48.23775697 +0000 UTC m=+21.766334219"
Apr 17 11:30:48.250167 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.250127 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ts68p" podStartSLOduration=4.139940546 podStartE2EDuration="21.250112715s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.767236634 +0000 UTC m=+3.295813876" lastFinishedPulling="2026-04-17 11:30:46.877408816 +0000 UTC m=+20.405986045" observedRunningTime="2026-04-17 11:30:48.250111145 +0000 UTC m=+21.778688394" watchObservedRunningTime="2026-04-17 11:30:48.250112715 +0000 UTC m=+21.778689963"
Apr 17 11:30:48.264847 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:48.264803 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mpm2g" podStartSLOduration=3.981015202 podStartE2EDuration="21.264787927s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.755116305 +0000 UTC m=+3.283693532" lastFinishedPulling="2026-04-17 11:30:47.038889017 +0000 UTC m=+20.567466257" observedRunningTime="2026-04-17 11:30:48.264767269 +0000 UTC m=+21.793344521" watchObservedRunningTime="2026-04-17 11:30:48.264787927 +0000 UTC m=+21.793365176"
Apr 17 11:30:49.027974 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.027766 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:30:48.157990064Z","UUID":"d70b2870-d426-43a9-b4f4-a1b4d304998c","Handler":null,"Name":"","Endpoint":""}
Apr 17 11:30:49.029575 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.029553 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 11:30:49.029695 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.029583 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 11:30:49.088182 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.088148 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:49.088343 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:49.088285 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:49.188852 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.188757 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" event={"ID":"bc493885-90a9-4fcc-9331-806c0d60be7d","Type":"ContainerStarted","Data":"63945bf368a51a3ccc90b9633a7464dbe5651eb9ac26b2b72da9ec581e58c352"}
Apr 17 11:30:49.190423 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.190385 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xhfhs" event={"ID":"bec258cb-b4a0-425e-b582-b392c2becdfe","Type":"ContainerStarted","Data":"7e387c97355b7458da60203bc1a5376883d7089c16bc9af4de6559388e98cacb"}
Apr 17 11:30:49.205392 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.205337 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-drxgw" podStartSLOduration=3.062790935 podStartE2EDuration="22.20532385s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.763667323 +0000 UTC m=+3.292244550" lastFinishedPulling="2026-04-17 11:30:48.906200233 +0000 UTC m=+22.434777465" observedRunningTime="2026-04-17 11:30:49.204961086 +0000 UTC m=+22.733538335" watchObservedRunningTime="2026-04-17 11:30:49.20532385 +0000 UTC m=+22.733901095"
Apr 17 11:30:49.218721 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:49.218660 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xhfhs" podStartSLOduration=5.108291744 podStartE2EDuration="22.218640811s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.767062006 +0000 UTC m=+3.295639252" lastFinishedPulling="2026-04-17 11:30:46.877411085 +0000 UTC m=+20.405988319" observedRunningTime="2026-04-17 11:30:49.218346695 +0000 UTC m=+22.746923944" watchObservedRunningTime="2026-04-17 11:30:49.218640811 +0000 UTC m=+22.747218062"
Apr 17 11:30:50.088128 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:50.087816 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:50.088304 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:50.088200 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:50.195588 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:50.195549 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"6962d14decb5fe35971257d5aaf6a44364c8045f29bf45099e08026b0ca3e0f7"}
Apr 17 11:30:50.392290 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:50.392192 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hlx77"
Apr 17 11:30:50.392990 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:50.392967 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hlx77"
Apr 17 11:30:51.088065 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:51.088031 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:51.088291 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:51.088144 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:52.088598 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:52.088557 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:52.089249 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:52.088679 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:52.203697 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:52.203488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" event={"ID":"78932d63-d2fd-4c01-8666-7f65f21faaac","Type":"ContainerStarted","Data":"95a5136e7f00c8ca835e19122c7222d0fddf532b1715e6f77ba24a2c74b45f9a"}
Apr 17 11:30:52.204138 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:52.204046 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:52.204138 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:52.204082 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:52.223547 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:52.223472 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:52.231817 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:52.231751 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" podStartSLOduration=7.884820777 podStartE2EDuration="25.23173204s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.757099021 +0000 UTC m=+3.285676250" lastFinishedPulling="2026-04-17 11:30:47.104010272 +0000 UTC m=+20.632587513" observedRunningTime="2026-04-17 11:30:52.231023025 +0000 UTC m=+25.759600275" watchObservedRunningTime="2026-04-17 11:30:52.23173204 +0000 UTC m=+25.760309289"
Apr 17 11:30:53.087851 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:53.087815 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:53.088008 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:53.087921 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:53.206461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:53.206425 2570 generic.go:358] "Generic (PLEG): container finished" podID="652897df-2286-4fbc-9cf6-a7ce5de5d8a3" containerID="6d046fff8a4196400278ce019255027b2d1c5ddd8f65830219dde91b1353e756" exitCode=0
Apr 17 11:30:53.206921 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:53.206498 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerDied","Data":"6d046fff8a4196400278ce019255027b2d1c5ddd8f65830219dde91b1353e756"}
Apr 17 11:30:53.207834 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:53.207151 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:53.222405 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:53.222377 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2"
Apr 17 11:30:54.036747 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:54.036711 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5r2k5"]
Apr 17 11:30:54.036909 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:54.036854 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:54.036983 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:54.036958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:54.039550 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:54.039502 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z52nx"]
Apr 17 11:30:54.039683 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:54.039671 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:54.039821 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:54.039798 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:55.214927 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:55.214878 2570 generic.go:358] "Generic (PLEG): container finished" podID="652897df-2286-4fbc-9cf6-a7ce5de5d8a3" containerID="96de41ebbc9fb57f935be102e6b0af2287ff35866b77577418b56216cabb9b93" exitCode=0
Apr 17 11:30:55.215312 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:55.214950 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerDied","Data":"96de41ebbc9fb57f935be102e6b0af2287ff35866b77577418b56216cabb9b93"}
Apr 17 11:30:56.087829 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:56.087754 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:56.087829 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:56.087807 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:56.088009 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:56.087893 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:56.088045 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:56.087998 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:57.221773 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:57.221731 2570 generic.go:358] "Generic (PLEG): container finished" podID="652897df-2286-4fbc-9cf6-a7ce5de5d8a3" containerID="8f7616da6084215e62d543456c8de9926546b792c15843948e1c7fa9856d62ea" exitCode=0
Apr 17 11:30:57.222285 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:57.221799 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerDied","Data":"8f7616da6084215e62d543456c8de9926546b792c15843948e1c7fa9856d62ea"}
Apr 17 11:30:58.088416 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.088383 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:30:58.088643 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.088383 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:30:58.088643 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:58.088499 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5r2k5" podUID="d764f3ad-e076-4b99-8a6f-716b6d83c925"
Apr 17 11:30:58.088643 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:58.088615 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:30:58.386328 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.386238 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hlx77"
Apr 17 11:30:58.386801 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.386398 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 11:30:58.386897 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.386876 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hlx77"
Apr 17 11:30:58.780908 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.780877 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-210.ec2.internal" event="NodeReady"
Apr 17 11:30:58.781098 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.781035 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 11:30:58.835043 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.834999 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cxbf8"]
Apr 17 11:30:58.849729 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.849688 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pdqss"]
Apr 17 11:30:58.850656 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.849882 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:58.852464 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.852286 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 11:30:58.852626 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.852544 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ltml9\""
Apr 17 11:30:58.852626 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.852585 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 11:30:58.864794 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.864763 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cxbf8"]
Apr 17 11:30:58.864939 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.864813 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pdqss"]
Apr 17 11:30:58.864939 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.864934 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:58.867407 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.867378 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 11:30:58.867571 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.867451 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 11:30:58.867621 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.867391 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 11:30:58.867677 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.867386 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hmj98\""
Apr 17 11:30:58.992079 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.991991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:58.992079 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.992052 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfwd\" (UniqueName: \"kubernetes.io/projected/e3bee7d3-b329-44aa-922f-a04ce5b599e7-kube-api-access-xxfwd\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:58.992299 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.992165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:58.992299 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.992214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm6hw\" (UniqueName: \"kubernetes.io/projected/66f573d5-80a9-4ecf-ad6b-6cf684898a74-kube-api-access-zm6hw\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:58.992299 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.992275 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f573d5-80a9-4ecf-ad6b-6cf684898a74-config-volume\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:58.992418 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:58.992331 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66f573d5-80a9-4ecf-ad6b-6cf684898a74-tmp-dir\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.092730 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.092677 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfwd\" (UniqueName: \"kubernetes.io/projected/e3bee7d3-b329-44aa-922f-a04ce5b599e7-kube-api-access-xxfwd\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:59.092915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.092762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.092915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.092788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm6hw\" (UniqueName: \"kubernetes.io/projected/66f573d5-80a9-4ecf-ad6b-6cf684898a74-kube-api-access-zm6hw\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.092915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.092824 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f573d5-80a9-4ecf-ad6b-6cf684898a74-config-volume\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.092915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.092858 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66f573d5-80a9-4ecf-ad6b-6cf684898a74-tmp-dir\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.092915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.092886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:59.093161 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.092918 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:30:59.093161 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.092976 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:30:59.093161 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.092987 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:59.592964236 +0000 UTC m=+33.121541480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found
Apr 17 11:30:59.093161 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.093014 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:59.593002533 +0000 UTC m=+33.121579766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found
Apr 17 11:30:59.093361 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.093223 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66f573d5-80a9-4ecf-ad6b-6cf684898a74-tmp-dir\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.093583 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.093562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f573d5-80a9-4ecf-ad6b-6cf684898a74-config-volume\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.103568 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.103536 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm6hw\" (UniqueName: \"kubernetes.io/projected/66f573d5-80a9-4ecf-ad6b-6cf684898a74-kube-api-access-zm6hw\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.103716 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.103633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfwd\" (UniqueName: \"kubernetes.io/projected/e3bee7d3-b329-44aa-922f-a04ce5b599e7-kube-api-access-xxfwd\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:59.597843 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.597809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:30:59.598657 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:30:59.597880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:30:59.598657 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.597995 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:30:59.598657 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.597997 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:30:59.598657 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.598061 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:00.598042439 +0000 UTC m=+34.126619667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found
Apr 17 11:30:59.598657 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:30:59.598080 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:00.598071314 +0000 UTC m=+34.126648541 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found
Apr 17 11:31:00.088472 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.088387 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:31:00.088767 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.088387 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:31:00.092253 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.091935 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lzg8w\""
Apr 17 11:31:00.092253 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.091950 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:31:00.092253 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.092062 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 11:31:00.092253 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.092087 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 11:31:00.092253 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.092218 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fqsgp\""
Apr 17 11:31:00.604397 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.604360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:31:00.604865 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.604424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:31:00.604865 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:00.604564 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:31:00.604865 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:00.604568 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:31:00.604865 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:00.604638 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:02.604616023 +0000 UTC m=+36.133193295 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found
Apr 17 11:31:00.604865 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:00.604660 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:02.604648809 +0000 UTC m=+36.133226041 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found
Apr 17 11:31:00.805731 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.805692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:31:00.805926 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:00.805843 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:31:00.805926 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:00.805920 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:32.805894226 +0000 UTC m=+66.334471454 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : secret "metrics-daemon-secret" not found
Apr 17 11:31:00.906224 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.906136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:31:00.920227 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:00.920194 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8kv4\" (UniqueName: \"kubernetes.io/projected/d764f3ad-e076-4b99-8a6f-716b6d83c925-kube-api-access-q8kv4\") pod \"network-check-target-5r2k5\" (UID: \"d764f3ad-e076-4b99-8a6f-716b6d83c925\") " pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:31:01.010346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:01.010107 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5r2k5"
Apr 17 11:31:02.621032 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:02.620982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:31:02.621478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:02.621085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:31:02.621478 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:02.621146 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:31:02.621478 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:02.621194 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:31:02.621478 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:02.621234 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:06.621216982 +0000 UTC m=+40.149794222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found
Apr 17 11:31:02.621478 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:02.621254 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:06.621245233 +0000 UTC m=+40.149822459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found
Apr 17 11:31:03.021972 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:03.021940 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5r2k5"]
Apr 17 11:31:03.117058 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:31:03.117017 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd764f3ad_e076_4b99_8a6f_716b6d83c925.slice/crio-b19e094302ad8d0fd3cb429dcae619c6cef08138f9c2083ecff40ac53230e619 WatchSource:0}: Error finding container b19e094302ad8d0fd3cb429dcae619c6cef08138f9c2083ecff40ac53230e619: Status 404 returned error can't find the container with id b19e094302ad8d0fd3cb429dcae619c6cef08138f9c2083ecff40ac53230e619
Apr 17 11:31:03.234920 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:03.234883 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5r2k5" event={"ID":"d764f3ad-e076-4b99-8a6f-716b6d83c925","Type":"ContainerStarted","Data":"b19e094302ad8d0fd3cb429dcae619c6cef08138f9c2083ecff40ac53230e619"}
Apr 17 11:31:04.240333 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.240299 2570 generic.go:358] "Generic (PLEG): container finished" podID="652897df-2286-4fbc-9cf6-a7ce5de5d8a3" containerID="0f9ba1586d5dae0eff2a02821c602c58b5f2ad95b7dc96ee03ef88751edcc90a" exitCode=0
Apr 17 11:31:04.240803 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.240357 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerDied","Data":"0f9ba1586d5dae0eff2a02821c602c58b5f2ad95b7dc96ee03ef88751edcc90a"}
Apr 17 11:31:04.792921 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.792838 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"]
Apr 17 11:31:04.807864 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.807824 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"]
Apr 17 11:31:04.808024 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.807965 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:04.811863 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.811830 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 11:31:04.813283 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.812207 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 11:31:04.813283 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.812282 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 11:31:04.813283 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.812211 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 11:31:04.939155 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.939119 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kllgf\" (UniqueName: \"kubernetes.io/projected/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-kube-api-access-kllgf\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:04.939155 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.939164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-tmp\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:04.939353 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:04.939184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.039712 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.039674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kllgf\" (UniqueName: \"kubernetes.io/projected/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-kube-api-access-kllgf\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.039888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.039734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-tmp\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.039888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.039752 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.040564 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.040501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-tmp\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.044898 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.044842 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.047784 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.047755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kllgf\" (UniqueName: \"kubernetes.io/projected/8d6a3bcb-48e8-419c-bf32-5d7d48940e2c-kube-api-access-kllgf\") pod \"klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb\" (UID: \"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.121029 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.120987 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"
Apr 17 11:31:05.245799 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.245761 2570 generic.go:358] "Generic (PLEG): container finished" podID="652897df-2286-4fbc-9cf6-a7ce5de5d8a3" containerID="597a98fdad63f81c88c60fa238396539d5eea0b9b29de1160af555c837bab98e" exitCode=0
Apr 17 11:31:05.246186 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:05.245825 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerDied","Data":"597a98fdad63f81c88c60fa238396539d5eea0b9b29de1160af555c837bab98e"}
Apr 17 11:31:06.084490 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:06.084462 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb"]
Apr 17 11:31:06.087931 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:31:06.087899 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6a3bcb_48e8_419c_bf32_5d7d48940e2c.slice/crio-55b494217810a119289dbccd5c5b85769c579df7697885c0d727045017b8c2c1 WatchSource:0}: Error finding container 55b494217810a119289dbccd5c5b85769c579df7697885c0d727045017b8c2c1: Status 404 returned error can't find the container with id 55b494217810a119289dbccd5c5b85769c579df7697885c0d727045017b8c2c1
Apr 17 11:31:06.249095 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:06.249010 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" event={"ID":"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c","Type":"ContainerStarted","Data":"55b494217810a119289dbccd5c5b85769c579df7697885c0d727045017b8c2c1"}
Apr 17 11:31:06.251992 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:06.251960 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q2phv" event={"ID":"652897df-2286-4fbc-9cf6-a7ce5de5d8a3","Type":"ContainerStarted","Data":"983464203be073565a009b24515ac6646a8bc76aa20f3d22b029b2749dabf477"}
Apr 17 11:31:06.274526 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:06.274473 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q2phv" podStartSLOduration=5.878847994 podStartE2EDuration="39.274457872s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:29.762164931 +0000 UTC m=+3.290742161" lastFinishedPulling="2026-04-17 11:31:03.157774803 +0000 UTC m=+36.686352039" observedRunningTime="2026-04-17 11:31:06.273672909 +0000 UTC m=+39.802250159" watchObservedRunningTime="2026-04-17 11:31:06.274457872 +0000 UTC m=+39.803035133"
Apr 17 11:31:06.651667 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:06.651616 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:31:06.651867 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:06.651681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID:
\"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss" Apr 17 11:31:06.651867 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:06.651788 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:31:06.651867 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:06.651792 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:31:06.651867 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:06.651840 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:14.651826916 +0000 UTC m=+48.180404143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found Apr 17 11:31:06.651867 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:06.651868 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:14.651850792 +0000 UTC m=+48.180428027 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found Apr 17 11:31:07.256029 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:07.255978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5r2k5" event={"ID":"d764f3ad-e076-4b99-8a6f-716b6d83c925","Type":"ContainerStarted","Data":"fcbac3445204e6f4f4bbd1c8e1f29adfabc4ee1921c212430d238764eafef071"} Apr 17 11:31:07.256535 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:07.256306 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:31:07.271318 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:07.271267 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5r2k5" podStartSLOduration=37.094320512 podStartE2EDuration="40.271247123s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:31:03.134071648 +0000 UTC m=+36.662648875" lastFinishedPulling="2026-04-17 11:31:06.310998245 +0000 UTC m=+39.839575486" observedRunningTime="2026-04-17 11:31:07.270701811 +0000 UTC m=+40.799279059" watchObservedRunningTime="2026-04-17 11:31:07.271247123 +0000 UTC m=+40.799824373" Apr 17 11:31:10.264284 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:10.264243 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" event={"ID":"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c","Type":"ContainerStarted","Data":"3d6390627b2ae39db758d31a98a306773fad90f298e4be440da598aa1e1a4030"} Apr 17 11:31:10.264744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:10.264451 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" Apr 17 11:31:10.266068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:10.266044 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" Apr 17 11:31:10.284891 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:10.284841 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" podStartSLOduration=2.6932391 podStartE2EDuration="6.284825526s" podCreationTimestamp="2026-04-17 11:31:04 +0000 UTC" firstStartedPulling="2026-04-17 11:31:06.089735054 +0000 UTC m=+39.618312284" lastFinishedPulling="2026-04-17 11:31:09.681321479 +0000 UTC m=+43.209898710" observedRunningTime="2026-04-17 11:31:10.284304055 +0000 UTC m=+43.812881303" watchObservedRunningTime="2026-04-17 11:31:10.284825526 +0000 UTC m=+43.813402765" Apr 17 11:31:14.708759 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:14.708714 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8" Apr 17 11:31:14.709220 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:14.708773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss" Apr 17 11:31:14.709220 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:14.708865 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:31:14.709220 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:14.708868 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:31:14.709220 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:14.708927 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:30.70891309 +0000 UTC m=+64.237490317 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found Apr 17 11:31:14.709220 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:14.708940 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:30.708934114 +0000 UTC m=+64.237511341 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found Apr 17 11:31:25.232011 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:25.231976 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrz2" Apr 17 11:31:30.729469 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:30.729413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss" Apr 17 11:31:30.729946 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:30.729579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8" Apr 17 11:31:30.729946 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:30.729609 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:31:30.729946 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:30.729687 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:02.72967136 +0000 UTC m=+96.258248587 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found Apr 17 11:31:30.729946 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:30.729699 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:31:30.729946 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:30.729765 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:02.729746707 +0000 UTC m=+96.258323936 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found Apr 17 11:31:32.842340 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:32.842298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:31:32.842740 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:32.842406 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:31:32.842740 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:31:32.842470 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:36.842454911 +0000 UTC m=+130.371032137 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : secret "metrics-daemon-secret" not found Apr 17 11:31:38.261218 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:31:38.261185 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5r2k5" Apr 17 11:32:02.740229 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:02.740071 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss" Apr 17 11:32:02.740229 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:02.740148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8" Apr 17 11:32:02.740774 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:02.740228 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:32:02.740774 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:02.740249 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:32:02.740774 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:02.740324 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert podName:e3bee7d3-b329-44aa-922f-a04ce5b599e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:06.740304257 +0000 UTC m=+160.268881499 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert") pod "ingress-canary-pdqss" (UID: "e3bee7d3-b329-44aa-922f-a04ce5b599e7") : secret "canary-serving-cert" not found Apr 17 11:32:02.740774 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:02.740345 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls podName:66f573d5-80a9-4ecf-ad6b-6cf684898a74 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:06.740336769 +0000 UTC m=+160.268914021 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls") pod "dns-default-cxbf8" (UID: "66f573d5-80a9-4ecf-ad6b-6cf684898a74") : secret "dns-default-metrics-tls" not found Apr 17 11:32:24.645090 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.645057 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm"] Apr 17 11:32:24.646809 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.646793 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" Apr 17 11:32:24.659085 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.659065 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pbnbs\"" Apr 17 11:32:24.666117 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.666092 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"] Apr 17 11:32:24.667824 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.667805 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm"] Apr 17 11:32:24.667913 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.667891 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.675580 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.675564 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 11:32:24.676247 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.676233 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:32:24.676456 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.676442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-l7b5g\"" Apr 17 11:32:24.678009 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.677996 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:32:24.682014 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.681993 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 11:32:24.691651 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.691628 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcqd6\" (UniqueName: \"kubernetes.io/projected/431fab13-4e22-4667-b273-df590c4b98bd-kube-api-access-bcqd6\") pod \"network-check-source-8894fc9bd-78fsm\" (UID: \"431fab13-4e22-4667-b273-df590c4b98bd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" Apr 17 11:32:24.699429 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.699409 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"] Apr 17 11:32:24.743748 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.743719 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb"] Apr 17 11:32:24.745429 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.745413 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.749347 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.749326 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 11:32:24.749347 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.749343 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:32:24.749550 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.749344 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 11:32:24.749903 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.749880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-z6mbt\"" Apr 17 11:32:24.750922 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.750899 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq"] Apr 17 11:32:24.752725 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.752707 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8648f7995c-dng8z"] Apr 17 11:32:24.752860 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.752846 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.753499 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.753480 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 11:32:24.754476 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.754460 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.755657 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.755643 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5wgs9\"" Apr 17 11:32:24.756396 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.756377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 11:32:24.756494 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.756425 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 11:32:24.756494 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.756455 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 11:32:24.756613 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.756556 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:32:24.757548 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.757534 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 11:32:24.759228 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.759211 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 11:32:24.759311 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.759258 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 11:32:24.759474 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.759457 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5ntx8\"" Apr 17 11:32:24.763456 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.763433 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq"] Apr 17 11:32:24.766286 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.766267 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 11:32:24.769268 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.769248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb"] Apr 17 11:32:24.774525 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.774489 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8648f7995c-dng8z"] Apr 17 11:32:24.792950 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.792920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.793099 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.792965 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tlq\" (UniqueName: \"kubernetes.io/projected/6a5d3f15-276a-483f-8bff-f93a79e3882e-kube-api-access-k8tlq\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.793099 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.793017 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.793099 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.793087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.793220 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.793123 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25qdc\" (UniqueName: \"kubernetes.io/projected/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-kube-api-access-25qdc\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.793220 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.793150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcqd6\" (UniqueName: \"kubernetes.io/projected/431fab13-4e22-4667-b273-df590c4b98bd-kube-api-access-bcqd6\") pod \"network-check-source-8894fc9bd-78fsm\" (UID: \"431fab13-4e22-4667-b273-df590c4b98bd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" Apr 17 11:32:24.793220 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.793187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6a5d3f15-276a-483f-8bff-f93a79e3882e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.806496 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.806470 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcqd6\" (UniqueName: \"kubernetes.io/projected/431fab13-4e22-4667-b273-df590c4b98bd-kube-api-access-bcqd6\") pod \"network-check-source-8894fc9bd-78fsm\" (UID: \"431fab13-4e22-4667-b273-df590c4b98bd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" Apr 17 11:32:24.894026 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.893992 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-image-registry-private-configuration\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894026 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9ht\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-kube-api-access-6r9ht\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894234 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf5f208-16b3-41eb-b3eb-0b10391e7e74-config\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.894234 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25qdc\" (UniqueName: \"kubernetes.io/projected/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-kube-api-access-25qdc\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.894234 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-bound-sa-token\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894234 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6a5d3f15-276a-483f-8bff-f93a79e3882e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.894376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-certificates\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-installation-pull-secrets\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894376 ip-10-0-130-210 
kubenswrapper[2570]: I0417 11:32:24.894338 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7snt\" (UniqueName: \"kubernetes.io/projected/acf5f208-16b3-41eb-b3eb-0b10391e7e74-kube-api-access-s7snt\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.894376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894540 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.894540 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tlq\" (UniqueName: \"kubernetes.io/projected/6a5d3f15-276a-483f-8bff-f93a79e3882e-kube-api-access-k8tlq\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.894540 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-trusted-ca\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894650 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.894650 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894584 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf5f208-16b3-41eb-b3eb-0b10391e7e74-serving-cert\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.894650 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8643c8-eca1-4bb8-957a-ed244556d475-ca-trust-extracted\") pod 
\"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.894762 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894712 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.894850 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:24.894833 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:32:24.894913 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.894898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6a5d3f15-276a-483f-8bff-f93a79e3882e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.894963 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:24.894913 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls podName:6a5d3f15-276a-483f-8bff-f93a79e3882e nodeName:}" failed. No retries permitted until 2026-04-17 11:32:25.394893386 +0000 UTC m=+118.923470623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jphln" (UID: "6a5d3f15-276a-483f-8bff-f93a79e3882e") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:32:24.895153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.895113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.896636 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.896615 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.902865 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.902842 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tlq\" (UniqueName: \"kubernetes.io/projected/6a5d3f15-276a-483f-8bff-f93a79e3882e-kube-api-access-k8tlq\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" Apr 17 11:32:24.903076 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.903059 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25qdc\" (UniqueName: \"kubernetes.io/projected/74b00ba1-c80d-4505-a365-8d7a3c6c7f9a-kube-api-access-25qdc\") pod \"kube-storage-version-migrator-operator-6769c5d45-wv6fb\" (UID: \"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:24.956114 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.956078 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" Apr 17 11:32:24.995316 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995284 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-certificates\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.995488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-installation-pull-secrets\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.995488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7snt\" (UniqueName: \"kubernetes.io/projected/acf5f208-16b3-41eb-b3eb-0b10391e7e74-kube-api-access-s7snt\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.995488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995377 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.995488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-trusted-ca\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.995488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995466 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf5f208-16b3-41eb-b3eb-0b10391e7e74-serving-cert\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.995488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/6b8643c8-eca1-4bb8-957a-ed244556d475-ca-trust-extracted\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:24.995551 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:24.995574 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8648f7995c-dng8z: secret "image-registry-tls" not found Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-image-registry-private-configuration\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995616 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9ht\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-kube-api-access-6r9ht\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:24.995635 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls podName:6b8643c8-eca1-4bb8-957a-ed244556d475 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:25.495614308 +0000 UTC m=+119.024191556 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls") pod "image-registry-8648f7995c-dng8z" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475") : secret "image-registry-tls" not found Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf5f208-16b3-41eb-b3eb-0b10391e7e74-config\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.995710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-bound-sa-token\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.996053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-certificates\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.996554 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.996394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf5f208-16b3-41eb-b3eb-0b10391e7e74-config\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.996874 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.996743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8643c8-eca1-4bb8-957a-ed244556d475-ca-trust-extracted\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.997040 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.997019 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-trusted-ca\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.997854 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.997833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf5f208-16b3-41eb-b3eb-0b10391e7e74-serving-cert\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:24.998356 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.998322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-installation-pull-secrets\") pod 
\"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:24.998460 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:24.998416 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-image-registry-private-configuration\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:25.006697 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.006669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-bound-sa-token\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:25.006797 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.006739 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7snt\" (UniqueName: \"kubernetes.io/projected/acf5f208-16b3-41eb-b3eb-0b10391e7e74-kube-api-access-s7snt\") pod \"service-ca-operator-d6fc45fc5-n7ngq\" (UID: \"acf5f208-16b3-41eb-b3eb-0b10391e7e74\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" Apr 17 11:32:25.007046 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.007029 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9ht\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-kube-api-access-6r9ht\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z" Apr 17 11:32:25.054162 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.054129 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" Apr 17 11:32:25.061955 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.061930 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq"
Apr 17 11:32:25.073196 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.073170 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm"]
Apr 17 11:32:25.075614 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:25.075567 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod431fab13_4e22_4667_b273_df590c4b98bd.slice/crio-2d997d299ad8a3909f459c71a72c7939c6de531cccfb39da834c526a76e6673a WatchSource:0}: Error finding container 2d997d299ad8a3909f459c71a72c7939c6de531cccfb39da834c526a76e6673a: Status 404 returned error can't find the container with id 2d997d299ad8a3909f459c71a72c7939c6de531cccfb39da834c526a76e6673a
Apr 17 11:32:25.184033 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.184001 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb"]
Apr 17 11:32:25.203005 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.202972 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq"]
Apr 17 11:32:25.206200 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:25.206176 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf5f208_16b3_41eb_b3eb_0b10391e7e74.slice/crio-b5c49048f02a4f3d9b14adf578cb42759a8c309d40b5fa6ac874b8cdcba48030 WatchSource:0}: Error finding container b5c49048f02a4f3d9b14adf578cb42759a8c309d40b5fa6ac874b8cdcba48030: Status 404 returned error can't find the container with id b5c49048f02a4f3d9b14adf578cb42759a8c309d40b5fa6ac874b8cdcba48030
Apr 17 11:32:25.397476 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.397434 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:25.397668 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:25.397609 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:25.397712 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:25.397680 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls podName:6a5d3f15-276a-483f-8bff-f93a79e3882e nodeName:}" failed. No retries permitted until 2026-04-17 11:32:26.397661278 +0000 UTC m=+119.926238511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jphln" (UID: "6a5d3f15-276a-483f-8bff-f93a79e3882e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:25.407067 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.406995 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" event={"ID":"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a","Type":"ContainerStarted","Data":"d42cf916c2b37eb802498f5463eb571b5a6f325579604ddfea9850d7387fb2f9"}
Apr 17 11:32:25.408054 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.408023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" event={"ID":"acf5f208-16b3-41eb-b3eb-0b10391e7e74","Type":"ContainerStarted","Data":"b5c49048f02a4f3d9b14adf578cb42759a8c309d40b5fa6ac874b8cdcba48030"}
Apr 17 11:32:25.409270 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.409251 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" event={"ID":"431fab13-4e22-4667-b273-df590c4b98bd","Type":"ContainerStarted","Data":"ef2b260ea9eda484b8e046990571267059e40659a2c11fbc3d59a40128ededf0"}
Apr 17 11:32:25.409381 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.409273 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" event={"ID":"431fab13-4e22-4667-b273-df590c4b98bd","Type":"ContainerStarted","Data":"2d997d299ad8a3909f459c71a72c7939c6de531cccfb39da834c526a76e6673a"}
Apr 17 11:32:25.427424 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.427321 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-78fsm" podStartSLOduration=1.427305151 podStartE2EDuration="1.427305151s" podCreationTimestamp="2026-04-17 11:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:32:25.42719208 +0000 UTC m=+118.955769332" watchObservedRunningTime="2026-04-17 11:32:25.427305151 +0000 UTC m=+118.955882403"
Apr 17 11:32:25.498234 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:25.498198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:25.498394 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:25.498331 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:32:25.498394 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:25.498346 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8648f7995c-dng8z: secret "image-registry-tls" not found
Apr 17 11:32:25.498460 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:25.498399 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls podName:6b8643c8-eca1-4bb8-957a-ed244556d475 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:26.498386171 +0000 UTC m=+120.026963398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls") pod "image-registry-8648f7995c-dng8z" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475") : secret "image-registry-tls" not found
Apr 17 11:32:26.406784 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:26.406407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:26.406784 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:26.406595 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:26.406784 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:26.406667 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls podName:6a5d3f15-276a-483f-8bff-f93a79e3882e nodeName:}" failed. No retries permitted until 2026-04-17 11:32:28.406644828 +0000 UTC m=+121.935222076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jphln" (UID: "6a5d3f15-276a-483f-8bff-f93a79e3882e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:26.507113 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:26.507065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:26.507307 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:26.507239 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:32:26.507307 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:26.507255 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8648f7995c-dng8z: secret "image-registry-tls" not found
Apr 17 11:32:26.507417 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:26.507323 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls podName:6b8643c8-eca1-4bb8-957a-ed244556d475 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:28.507302326 +0000 UTC m=+122.035879570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls") pod "image-registry-8648f7995c-dng8z" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475") : secret "image-registry-tls" not found
Apr 17 11:32:28.417684 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:28.417641 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" event={"ID":"acf5f208-16b3-41eb-b3eb-0b10391e7e74","Type":"ContainerStarted","Data":"3b2e00027d2bd2182dd8cb95864e83621412aecc859e01a9aa8f84269b27176e"}
Apr 17 11:32:28.419039 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:28.419008 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" event={"ID":"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a","Type":"ContainerStarted","Data":"639fbaa3bd3e2065c43414fc669248b323486e801452b48b6332f5278cc40721"}
Apr 17 11:32:28.424571 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:28.424546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:28.424675 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:28.424660 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:28.424729 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:28.424718 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls podName:6a5d3f15-276a-483f-8bff-f93a79e3882e nodeName:}" failed. No retries permitted until 2026-04-17 11:32:32.424704015 +0000 UTC m=+125.953281246 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jphln" (UID: "6a5d3f15-276a-483f-8bff-f93a79e3882e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:28.444337 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:28.444292 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" podStartSLOduration=2.15789045 podStartE2EDuration="4.444280394s" podCreationTimestamp="2026-04-17 11:32:24 +0000 UTC" firstStartedPulling="2026-04-17 11:32:25.20837367 +0000 UTC m=+118.736950896" lastFinishedPulling="2026-04-17 11:32:27.494763611 +0000 UTC m=+121.023340840" observedRunningTime="2026-04-17 11:32:28.443524495 +0000 UTC m=+121.972101737" watchObservedRunningTime="2026-04-17 11:32:28.444280394 +0000 UTC m=+121.972857642"
Apr 17 11:32:28.457749 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:28.457705 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" podStartSLOduration=2.153243617 podStartE2EDuration="4.457692042s" podCreationTimestamp="2026-04-17 11:32:24 +0000 UTC" firstStartedPulling="2026-04-17 11:32:25.190705615 +0000 UTC m=+118.719282849" lastFinishedPulling="2026-04-17 11:32:27.495154046 +0000 UTC m=+121.023731274" observedRunningTime="2026-04-17 11:32:28.45732572 +0000 UTC m=+121.985902966" watchObservedRunningTime="2026-04-17 11:32:28.457692042 +0000 UTC m=+121.986269290"
Apr 17 11:32:28.524943 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:28.524909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:28.525133 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:28.525047 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:32:28.525133 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:28.525071 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8648f7995c-dng8z: secret "image-registry-tls" not found
Apr 17 11:32:28.525133 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:28.525128 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls podName:6b8643c8-eca1-4bb8-957a-ed244556d475 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:32.525110161 +0000 UTC m=+126.053687404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls") pod "image-registry-8648f7995c-dng8z" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475") : secret "image-registry-tls" not found
Apr 17 11:32:30.222033 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:30.222006 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7vxqv_57cb4093-bdc6-4637-9afc-7364349a96d4/dns-node-resolver/0.log"
Apr 17 11:32:31.421618 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:31.421594 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ts68p_3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4/node-ca/0.log"
Apr 17 11:32:32.126015 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.125979 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-cvp9f"]
Apr 17 11:32:32.128697 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.128677 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.130920 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.130896 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 11:32:32.131159 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.131142 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 11:32:32.131656 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.131640 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 11:32:32.131736 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.131675 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 11:32:32.131736 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.131680 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-kg76k\""
Apr 17 11:32:32.135798 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.135768 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-cvp9f"]
Apr 17 11:32:32.258458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.258422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqr2l\" (UniqueName: \"kubernetes.io/projected/3479fb90-7107-42aa-ae3e-b6291ca30d3f-kube-api-access-qqr2l\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.258637 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.258478 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3479fb90-7107-42aa-ae3e-b6291ca30d3f-signing-key\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.258637 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.258562 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3479fb90-7107-42aa-ae3e-b6291ca30d3f-signing-cabundle\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.359583 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.359544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqr2l\" (UniqueName: \"kubernetes.io/projected/3479fb90-7107-42aa-ae3e-b6291ca30d3f-kube-api-access-qqr2l\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.359748 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.359602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3479fb90-7107-42aa-ae3e-b6291ca30d3f-signing-key\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.359748 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.359653 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3479fb90-7107-42aa-ae3e-b6291ca30d3f-signing-cabundle\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.360398 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.360379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3479fb90-7107-42aa-ae3e-b6291ca30d3f-signing-cabundle\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.362062 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.362045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3479fb90-7107-42aa-ae3e-b6291ca30d3f-signing-key\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.368720 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.368697 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqr2l\" (UniqueName: \"kubernetes.io/projected/3479fb90-7107-42aa-ae3e-b6291ca30d3f-kube-api-access-qqr2l\") pod \"service-ca-865cb79987-cvp9f\" (UID: \"3479fb90-7107-42aa-ae3e-b6291ca30d3f\") " pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.438146 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.438050 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-cvp9f"
Apr 17 11:32:32.461038 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.461003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:32.461188 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:32.461167 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:32.461267 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:32.461251 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls podName:6a5d3f15-276a-483f-8bff-f93a79e3882e nodeName:}" failed. No retries permitted until 2026-04-17 11:32:40.461228731 +0000 UTC m=+133.989805970 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jphln" (UID: "6a5d3f15-276a-483f-8bff-f93a79e3882e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:32.552923 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.552889 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-cvp9f"]
Apr 17 11:32:32.556616 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:32.556582 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3479fb90_7107_42aa_ae3e_b6291ca30d3f.slice/crio-dbf2d95a80b267485139d25f0214bc6d860869291510c6a283fd95ab2b7bd80e WatchSource:0}: Error finding container dbf2d95a80b267485139d25f0214bc6d860869291510c6a283fd95ab2b7bd80e: Status 404 returned error can't find the container with id dbf2d95a80b267485139d25f0214bc6d860869291510c6a283fd95ab2b7bd80e
Apr 17 11:32:32.561608 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:32.561588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:32.561768 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:32.561748 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:32:32.561768 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:32.561768 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8648f7995c-dng8z: secret "image-registry-tls" not found
Apr 17 11:32:32.561875 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:32.561831 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls podName:6b8643c8-eca1-4bb8-957a-ed244556d475 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:40.561808613 +0000 UTC m=+134.090385840 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls") pod "image-registry-8648f7995c-dng8z" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475") : secret "image-registry-tls" not found
Apr 17 11:32:33.430366 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:33.430323 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-cvp9f" event={"ID":"3479fb90-7107-42aa-ae3e-b6291ca30d3f","Type":"ContainerStarted","Data":"c4c4f90e218333df18ad1b4017d70e4e7581a2cb1583b0f5cdf6505e34a14102"}
Apr 17 11:32:33.430366 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:33.430367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-cvp9f" event={"ID":"3479fb90-7107-42aa-ae3e-b6291ca30d3f","Type":"ContainerStarted","Data":"dbf2d95a80b267485139d25f0214bc6d860869291510c6a283fd95ab2b7bd80e"}
Apr 17 11:32:33.447655 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:33.447593 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-cvp9f" podStartSLOduration=1.447578322 podStartE2EDuration="1.447578322s" podCreationTimestamp="2026-04-17 11:32:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:32:33.446937865 +0000 UTC m=+126.975515115" watchObservedRunningTime="2026-04-17 11:32:33.447578322 +0000 UTC m=+126.976155574"
Apr 17 11:32:36.899602 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:36.899543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx"
Apr 17 11:32:36.899995 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:36.899685 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:32:36.899995 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:36.899752 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs podName:343340da-6202-4b41-8b3d-4e0c0f72ecb6 nodeName:}" failed. No retries permitted until 2026-04-17 11:34:38.899737701 +0000 UTC m=+252.428314928 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs") pod "network-metrics-daemon-z52nx" (UID: "343340da-6202-4b41-8b3d-4e0c0f72ecb6") : secret "metrics-daemon-secret" not found
Apr 17 11:32:40.529314 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:40.529277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:40.529776 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:40.529415 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:40.529776 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:32:40.529498 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls podName:6a5d3f15-276a-483f-8bff-f93a79e3882e nodeName:}" failed. No retries permitted until 2026-04-17 11:32:56.529475758 +0000 UTC m=+150.058052989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-jphln" (UID: "6a5d3f15-276a-483f-8bff-f93a79e3882e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:32:40.630663 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:40.630626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:40.633070 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:40.633039 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"image-registry-8648f7995c-dng8z\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") " pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:40.670043 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:40.670013 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5ntx8\""
Apr 17 11:32:40.678599 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:40.678576 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:40.795569 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:40.795476 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8648f7995c-dng8z"]
Apr 17 11:32:40.799245 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:40.799217 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8643c8_eca1_4bb8_957a_ed244556d475.slice/crio-20515735f9d921a13c47176e0e7056679443cb162edae3019b285442540f7f39 WatchSource:0}: Error finding container 20515735f9d921a13c47176e0e7056679443cb162edae3019b285442540f7f39: Status 404 returned error can't find the container with id 20515735f9d921a13c47176e0e7056679443cb162edae3019b285442540f7f39
Apr 17 11:32:41.451757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:41.451719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" event={"ID":"6b8643c8-eca1-4bb8-957a-ed244556d475","Type":"ContainerStarted","Data":"89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0"}
Apr 17 11:32:41.451757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:41.451754 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" event={"ID":"6b8643c8-eca1-4bb8-957a-ed244556d475","Type":"ContainerStarted","Data":"20515735f9d921a13c47176e0e7056679443cb162edae3019b285442540f7f39"}
Apr 17 11:32:41.452027 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:41.451853 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8648f7995c-dng8z"
Apr 17 11:32:41.472869 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:41.472801 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" podStartSLOduration=17.472787709 podStartE2EDuration="17.472787709s" podCreationTimestamp="2026-04-17 11:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:32:41.472408596 +0000 UTC m=+135.000985844" watchObservedRunningTime="2026-04-17 11:32:41.472787709 +0000 UTC m=+135.001364957"
Apr 17 11:32:53.070829 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.070793 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xs4lh"]
Apr 17 11:32:53.073800 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.073780 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.079990 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.079968 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:32:53.080249 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.080235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:32:53.080745 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.080728 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 11:32:53.080846 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.080831 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 11:32:53.088093 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.088071 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xs4lh"]
Apr 17 11:32:53.092506 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.092486 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4b7m5\""
Apr 17 11:32:53.125261 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.125229 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8648f7995c-dng8z"]
Apr 17 11:32:53.153862 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.153833 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-756549b58b-rdkhs"]
Apr 17 11:32:53.155820 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.155780 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.170579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.170557 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-756549b58b-rdkhs"]
Apr 17 11:32:53.228362 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.228334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/226b68ce-7c71-4b87-a853-b012721e7c68-crio-socket\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.228535 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.228377 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/226b68ce-7c71-4b87-a853-b012721e7c68-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.228535 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.228452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/226b68ce-7c71-4b87-a853-b012721e7c68-data-volume\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.228535 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.228495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65ctd\" (UniqueName: \"kubernetes.io/projected/226b68ce-7c71-4b87-a853-b012721e7c68-kube-api-access-65ctd\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.228647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.228533 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/226b68ce-7c71-4b87-a853-b012721e7c68-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.329907 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.329822 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/58df3935-ab94-49e4-9e5e-a716b5374775-image-registry-private-configuration\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.329907 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.329858 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58df3935-ab94-49e4-9e5e-a716b5374775-trusted-ca\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330097 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.329929 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzfmz\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-kube-api-access-vzfmz\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330097 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.329983 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58df3935-ab94-49e4-9e5e-a716b5374775-registry-certificates\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330097 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58df3935-ab94-49e4-9e5e-a716b5374775-ca-trust-extracted\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330097 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330067 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/226b68ce-7c71-4b87-a853-b012721e7c68-data-volume\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330122 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65ctd\" (UniqueName: \"kubernetes.io/projected/226b68ce-7c71-4b87-a853-b012721e7c68-kube-api-access-65ctd\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/226b68ce-7c71-4b87-a853-b012721e7c68-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330282 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/226b68ce-7c71-4b87-a853-b012721e7c68-crio-socket\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330282 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58df3935-ab94-49e4-9e5e-a716b5374775-installation-pull-secrets\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330354 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330297 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-registry-tls\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330354 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-bound-sa-token\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.330354 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/226b68ce-7c71-4b87-a853-b012721e7c68-crio-socket\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330487 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/226b68ce-7c71-4b87-a853-b012721e7c68-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330576 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330495 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/226b68ce-7c71-4b87-a853-b012721e7c68-data-volume\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.330914 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.330892 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/226b68ce-7c71-4b87-a853-b012721e7c68-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.332682 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.332654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/226b68ce-7c71-4b87-a853-b012721e7c68-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.342819 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.342793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65ctd\" (UniqueName: \"kubernetes.io/projected/226b68ce-7c71-4b87-a853-b012721e7c68-kube-api-access-65ctd\") pod \"insights-runtime-extractor-xs4lh\" (UID: \"226b68ce-7c71-4b87-a853-b012721e7c68\") " pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.382902 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.382868 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xs4lh"
Apr 17 11:32:53.431934 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.431897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzfmz\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-kube-api-access-vzfmz\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432076 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.431961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58df3935-ab94-49e4-9e5e-a716b5374775-registry-certificates\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432076 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.431990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58df3935-ab94-49e4-9e5e-a716b5374775-ca-trust-extracted\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432241 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.432220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58df3935-ab94-49e4-9e5e-a716b5374775-installation-pull-secrets\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432318 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.432277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-registry-tls\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432318 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.432309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-bound-sa-token\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432417 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.432373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/58df3935-ab94-49e4-9e5e-a716b5374775-image-registry-private-configuration\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432469 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.432402 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58df3935-ab94-49e4-9e5e-a716b5374775-trusted-ca\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.432469 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.432421 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58df3935-ab94-49e4-9e5e-a716b5374775-ca-trust-extracted\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.433098 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.433042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58df3935-ab94-49e4-9e5e-a716b5374775-registry-certificates\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.433679 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.433654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58df3935-ab94-49e4-9e5e-a716b5374775-trusted-ca\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.434866 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.434844 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58df3935-ab94-49e4-9e5e-a716b5374775-installation-pull-secrets\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.435783 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.435742 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/58df3935-ab94-49e4-9e5e-a716b5374775-image-registry-private-configuration\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.436068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.436049 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-registry-tls\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.442093 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.442038 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzfmz\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-kube-api-access-vzfmz\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.442410 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.442386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58df3935-ab94-49e4-9e5e-a716b5374775-bound-sa-token\") pod \"image-registry-756549b58b-rdkhs\" (UID: \"58df3935-ab94-49e4-9e5e-a716b5374775\") " pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.464154 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.464128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:53.508201 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.508125 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xs4lh"]
Apr 17 11:32:53.515473 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:53.515444 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod226b68ce_7c71_4b87_a853_b012721e7c68.slice/crio-68ab70edc8ba9614d92dab960fc44ebdcb2d8be32bffe7165e5e5b9537478e0a WatchSource:0}: Error finding container 68ab70edc8ba9614d92dab960fc44ebdcb2d8be32bffe7165e5e5b9537478e0a: Status 404 returned error can't find the container with id 68ab70edc8ba9614d92dab960fc44ebdcb2d8be32bffe7165e5e5b9537478e0a
Apr 17 11:32:53.589719 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:53.589690 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-756549b58b-rdkhs"]
Apr 17 11:32:53.591975 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:53.591944 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58df3935_ab94_49e4_9e5e_a716b5374775.slice/crio-56819cbab9ed950376f090f91505b7cd1791053c163919e808a1e957586f0230 WatchSource:0}: Error finding container 56819cbab9ed950376f090f91505b7cd1791053c163919e808a1e957586f0230: Status 404 returned error can't find the container with id 56819cbab9ed950376f090f91505b7cd1791053c163919e808a1e957586f0230
Apr 17 11:32:54.489913 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.489818 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-756549b58b-rdkhs" event={"ID":"58df3935-ab94-49e4-9e5e-a716b5374775","Type":"ContainerStarted","Data":"e40a430f464a0d8a2764407c5f18b1605ec52af095878ccecd1917d8ae0249a7"}
Apr 17 11:32:54.489913 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.489855 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-756549b58b-rdkhs" event={"ID":"58df3935-ab94-49e4-9e5e-a716b5374775","Type":"ContainerStarted","Data":"56819cbab9ed950376f090f91505b7cd1791053c163919e808a1e957586f0230"}
Apr 17 11:32:54.490357 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.489942 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:32:54.491433 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.491405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs4lh" event={"ID":"226b68ce-7c71-4b87-a853-b012721e7c68","Type":"ContainerStarted","Data":"067eb1c6219f7b4a9984a96080cf55c770818518f341fa781f533b827b6f43bc"}
Apr 17 11:32:54.491433 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.491432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs4lh" event={"ID":"226b68ce-7c71-4b87-a853-b012721e7c68","Type":"ContainerStarted","Data":"24fc60d0ac148f8a7c05c00c15547a3dea6cf1b98836ae7f1f079ef1f73adc95"}
Apr 17 11:32:54.491606 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.491441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs4lh" event={"ID":"226b68ce-7c71-4b87-a853-b012721e7c68","Type":"ContainerStarted","Data":"68ab70edc8ba9614d92dab960fc44ebdcb2d8be32bffe7165e5e5b9537478e0a"}
Apr 17 11:32:54.508097 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:54.508050 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-756549b58b-rdkhs" podStartSLOduration=1.508035963 podStartE2EDuration="1.508035963s" podCreationTimestamp="2026-04-17 11:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:32:54.507310404 +0000 UTC m=+148.035887664" watchObservedRunningTime="2026-04-17 11:32:54.508035963 +0000 UTC m=+148.036613211"
Apr 17 11:32:56.497579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.497540 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs4lh" event={"ID":"226b68ce-7c71-4b87-a853-b012721e7c68","Type":"ContainerStarted","Data":"cc904219f321d5369f395e22d5084d406c738a4c8f944c11b79b79efb8cf3990"}
Apr 17 11:32:56.514497 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.514445 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xs4lh" podStartSLOduration=1.4477663 podStartE2EDuration="3.514429317s" podCreationTimestamp="2026-04-17 11:32:53 +0000 UTC" firstStartedPulling="2026-04-17 11:32:53.582098077 +0000 UTC m=+147.110675305" lastFinishedPulling="2026-04-17 11:32:55.648761092 +0000 UTC m=+149.177338322" observedRunningTime="2026-04-17 11:32:56.514047903 +0000 UTC m=+150.042625152" watchObservedRunningTime="2026-04-17 11:32:56.514429317 +0000 UTC m=+150.043006565"
Apr 17 11:32:56.560478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.560440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:56.562961 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.562939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a5d3f15-276a-483f-8bff-f93a79e3882e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-jphln\" (UID: \"6a5d3f15-276a-483f-8bff-f93a79e3882e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:56.778173 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.778091 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-l7b5g\""
Apr 17 11:32:56.786752 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.786727 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"
Apr 17 11:32:56.919702 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:56.919670 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln"]
Apr 17 11:32:56.922631 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:32:56.922598 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5d3f15_276a_483f_8bff_f93a79e3882e.slice/crio-a9b5e745c85522370da6d76e7b04cad5ab68d73ad705b51da7faada53b7f4552 WatchSource:0}: Error finding container a9b5e745c85522370da6d76e7b04cad5ab68d73ad705b51da7faada53b7f4552: Status 404 returned error can't find the container with id a9b5e745c85522370da6d76e7b04cad5ab68d73ad705b51da7faada53b7f4552
Apr 17 11:32:57.501052 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:57.501015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" event={"ID":"6a5d3f15-276a-483f-8bff-f93a79e3882e","Type":"ContainerStarted","Data":"a9b5e745c85522370da6d76e7b04cad5ab68d73ad705b51da7faada53b7f4552"}
Apr 17 11:32:58.504722 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:58.504680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" event={"ID":"6a5d3f15-276a-483f-8bff-f93a79e3882e","Type":"ContainerStarted","Data":"e1130333d3c34b82f7c40759b149a07de22586fabfaf28729a0e22785f7c4735"}
Apr 17 11:32:58.521006 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:32:58.520955 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-jphln" podStartSLOduration=33.026625532 podStartE2EDuration="34.520937947s" podCreationTimestamp="2026-04-17 11:32:24 +0000 UTC" firstStartedPulling="2026-04-17 11:32:56.924502534 +0000 UTC m=+150.453079760" lastFinishedPulling="2026-04-17 11:32:58.418814921 +0000 UTC m=+151.947392175" observedRunningTime="2026-04-17 11:32:58.520113257 +0000 UTC m=+152.048690505" watchObservedRunningTime="2026-04-17 11:32:58.520937947 +0000 UTC m=+152.049515196"
Apr 17 11:33:01.862342 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:01.862300 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cxbf8" podUID="66f573d5-80a9-4ecf-ad6b-6cf684898a74"
Apr 17 11:33:01.876487 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:01.876449 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pdqss" podUID="e3bee7d3-b329-44aa-922f-a04ce5b599e7"
Apr 17 11:33:02.514631 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:02.514599 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:33:02.514786 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:02.514662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:33:03.103531 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:03.103479 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-z52nx" podUID="343340da-6202-4b41-8b3d-4e0c0f72ecb6"
Apr 17 11:33:03.130020 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:03.129988 2570 patch_prober.go:28] interesting pod/image-registry-8648f7995c-dng8z container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 11:33:03.130152 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:03.130035 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 11:33:06.741561 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:06.741489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:33:06.741959 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:06.741631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:33:06.743997 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:06.743968 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66f573d5-80a9-4ecf-ad6b-6cf684898a74-metrics-tls\") pod \"dns-default-cxbf8\" (UID: \"66f573d5-80a9-4ecf-ad6b-6cf684898a74\") " pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:33:06.744123 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:06.744038 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3bee7d3-b329-44aa-922f-a04ce5b599e7-cert\") pod \"ingress-canary-pdqss\" (UID: \"e3bee7d3-b329-44aa-922f-a04ce5b599e7\") " pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:33:07.019754 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.019680 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ltml9\""
Apr 17 11:33:07.020536 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.020498 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hmj98\""
Apr 17 11:33:07.026612 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.026573 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:33:07.026612 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.026582 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pdqss"
Apr 17 11:33:07.162082 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.162043 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cxbf8"]
Apr 17 11:33:07.165693 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:07.165655 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66f573d5_80a9_4ecf_ad6b_6cf684898a74.slice/crio-b1b3f62df5e644450bd22b8e431293d26eea12c41dd44d6d2837d759cca416d7 WatchSource:0}: Error finding container b1b3f62df5e644450bd22b8e431293d26eea12c41dd44d6d2837d759cca416d7: Status 404 returned error can't find the container with id b1b3f62df5e644450bd22b8e431293d26eea12c41dd44d6d2837d759cca416d7
Apr 17 11:33:07.180077 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.180052 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pdqss"]
Apr 17 11:33:07.183173 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:07.183146 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bee7d3_b329_44aa_922f_a04ce5b599e7.slice/crio-40de1d86cb6d03fed031219419121e3b75546d8aa0e56540232fe3fca96a99fb WatchSource:0}: Error finding container 40de1d86cb6d03fed031219419121e3b75546d8aa0e56540232fe3fca96a99fb: Status 404 returned error can't find the container with id 40de1d86cb6d03fed031219419121e3b75546d8aa0e56540232fe3fca96a99fb
Apr 17 11:33:07.309786 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.309712 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rjsqn"]
Apr 17 11:33:07.313447 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.313423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rjsqn"
Apr 17 11:33:07.315832 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.315806 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:33:07.317431 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.317410 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:33:07.317572 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.317408 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:33:07.317784 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.317643 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 11:33:07.317784 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.317693 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wj9ml\""
Apr 17 11:33:07.346643 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn"
Apr 17 11:33:07.346792 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346680 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-wtmp\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn"
Apr 17 11:33:07.346792 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-accelerators-collector-config\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn"
Apr 17 11:33:07.346871 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-textfile\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn"
Apr 17 11:33:07.346871 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpg24\" (UniqueName: \"kubernetes.io/projected/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-kube-api-access-fpg24\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn"
Apr 17 11:33:07.346871 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346841 2570 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-sys\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.346991 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.346991 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-metrics-client-ca\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.346991 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.346933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-root\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448286 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-metrics-client-ca\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-root\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448332 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448379 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-wtmp\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-root\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 
11:33:07.448485 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-accelerators-collector-config\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448765 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-textfile\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448765 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpg24\" (UniqueName: \"kubernetes.io/projected/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-kube-api-access-fpg24\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448765 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-wtmp\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448765 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448581 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-sys\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.448765 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.449016 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:07.448802 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:33:07.449016 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448836 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-textfile\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.449016 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:07.448879 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls podName:34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:07.948859251 +0000 UTC m=+161.477436493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls") pod "node-exporter-rjsqn" (UID: "34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8") : secret "node-exporter-tls" not found Apr 17 11:33:07.449016 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-sys\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.449016 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.448950 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-metrics-client-ca\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.449370 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.449351 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-accelerators-collector-config\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.451161 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.451139 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.459506 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.459486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpg24\" (UniqueName: \"kubernetes.io/projected/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-kube-api-access-fpg24\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.529195 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.529159 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cxbf8" event={"ID":"66f573d5-80a9-4ecf-ad6b-6cf684898a74","Type":"ContainerStarted","Data":"b1b3f62df5e644450bd22b8e431293d26eea12c41dd44d6d2837d759cca416d7"} Apr 17 11:33:07.530217 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.530186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pdqss" event={"ID":"e3bee7d3-b329-44aa-922f-a04ce5b599e7","Type":"ContainerStarted","Data":"40de1d86cb6d03fed031219419121e3b75546d8aa0e56540232fe3fca96a99fb"} Apr 17 11:33:07.953534 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:07.953189 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:07.953534 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:07.953384 2570 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:33:07.953534 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:07.953446 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls podName:34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:08.95342581 +0000 UTC m=+162.482003040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls") pod "node-exporter-rjsqn" (UID: "34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8") : secret "node-exporter-tls" not found Apr 17 11:33:08.383891 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.383851 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:33:08.387068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.387034 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.391739 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.390857 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 11:33:08.391739 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.391083 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 11:33:08.391739 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.391268 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8jts2\"" Apr 17 11:33:08.391739 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.391425 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 11:33:08.391739 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.391603 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 11:33:08.393541 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.392346 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 11:33:08.393541 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.392454 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 11:33:08.393541 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.392761 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 11:33:08.393541 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.392953 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 11:33:08.393541 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.393184 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 11:33:08.400650 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.400610 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:33:08.460031 
ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.459991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460235 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460235 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460235 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460176 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460467 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460245 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460467 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460467 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460467 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqlk\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-kube-api-access-kdqlk\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460687 ip-10-0-130-210 kubenswrapper[2570]: 
I0417 11:33:08.460477 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-volume\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460687 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460507 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-web-config\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460687 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460687 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.460687 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.460667 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-out\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562028 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.561998 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562039 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqlk\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-kube-api-access-kdqlk\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562063 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-volume\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-web-config\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562221 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:08.562214 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle podName:8098d7c4-6d94-4d0e-95ae-67ca813d7557 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:09.062191075 +0000 UTC m=+162.590768323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557") : configmap references non-existent config key: ca-bundle.crt Apr 17 11:33:08.562543 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562260 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-out\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562543 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562543 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562700 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562700 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562589 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562700 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.562700 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.562686 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.566387 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.565491 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.566387 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.565986 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.566709 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.566540 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-volume\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.567362 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.567306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.567565 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.567463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.567842 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.567724 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-main-tls\") pod 
\"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.567842 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.567785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-out\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.567842 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.567818 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-web-config\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.571618 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.568109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.571618 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.569769 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.572733 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.572695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.574469 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.574444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqlk\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-kube-api-access-kdqlk\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:08.970981 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.970929 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:08.973567 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:08.973540 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8-node-exporter-tls\") pod \"node-exporter-rjsqn\" (UID: \"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8\") " pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:09.071933 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.071891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:09.072932 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.072901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:09.129505 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.129472 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rjsqn" Apr 17 11:33:09.167857 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:09.167824 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e2ff2c_c93b_4f81_b6bd_478fd6c17ae8.slice/crio-4e59ad23fe7a1f1e6ec544952836d99b93904d9053082b34a7f7c5f79493f8b9 WatchSource:0}: Error finding container 4e59ad23fe7a1f1e6ec544952836d99b93904d9053082b34a7f7c5f79493f8b9: Status 404 returned error can't find the container with id 4e59ad23fe7a1f1e6ec544952836d99b93904d9053082b34a7f7c5f79493f8b9 Apr 17 11:33:09.303241 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.303210 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:33:09.472012 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.471980 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:33:09.475471 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:09.475443 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8098d7c4_6d94_4d0e_95ae_67ca813d7557.slice/crio-9731d39f6d8468cf83d076c490cf88364bdc815cf8dbf149b04ee527ae07d062 WatchSource:0}: Error finding container 9731d39f6d8468cf83d076c490cf88364bdc815cf8dbf149b04ee527ae07d062: Status 404 returned error can't find the container with id 9731d39f6d8468cf83d076c490cf88364bdc815cf8dbf149b04ee527ae07d062 Apr 17 11:33:09.537195 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.537160 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pdqss" event={"ID":"e3bee7d3-b329-44aa-922f-a04ce5b599e7","Type":"ContainerStarted","Data":"efa36277f4e57857fd5ad3548f03199155d9da5082c35f9f4592ba3a89f64557"} Apr 17 11:33:09.538236 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.538213 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"9731d39f6d8468cf83d076c490cf88364bdc815cf8dbf149b04ee527ae07d062"} Apr 17 11:33:09.539693 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.539671 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cxbf8" event={"ID":"66f573d5-80a9-4ecf-ad6b-6cf684898a74","Type":"ContainerStarted","Data":"67e6ef7a90aa222a75d52d7761f61bbd0271c36b249840819d6d4760ddfc35dd"} Apr 17 11:33:09.539877 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.539698 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cxbf8" 
event={"ID":"66f573d5-80a9-4ecf-ad6b-6cf684898a74","Type":"ContainerStarted","Data":"f340af9dbc67d7a89ff6b585c3565c2c4777377cf10f0166b8856725dfb9e082"} Apr 17 11:33:09.539877 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.539790 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cxbf8" Apr 17 11:33:09.540713 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.540694 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjsqn" event={"ID":"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8","Type":"ContainerStarted","Data":"4e59ad23fe7a1f1e6ec544952836d99b93904d9053082b34a7f7c5f79493f8b9"} Apr 17 11:33:09.553496 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.553456 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pdqss" podStartSLOduration=129.567511465 podStartE2EDuration="2m11.55344212s" podCreationTimestamp="2026-04-17 11:30:58 +0000 UTC" firstStartedPulling="2026-04-17 11:33:07.185137383 +0000 UTC m=+160.713714622" lastFinishedPulling="2026-04-17 11:33:09.171068048 +0000 UTC m=+162.699645277" observedRunningTime="2026-04-17 11:33:09.552895751 +0000 UTC m=+163.081473002" watchObservedRunningTime="2026-04-17 11:33:09.55344212 +0000 UTC m=+163.082019364" Apr 17 11:33:09.569551 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:09.569494 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cxbf8" podStartSLOduration=129.570400029 podStartE2EDuration="2m11.569477476s" podCreationTimestamp="2026-04-17 11:30:58 +0000 UTC" firstStartedPulling="2026-04-17 11:33:07.167793444 +0000 UTC m=+160.696370672" lastFinishedPulling="2026-04-17 11:33:09.166870888 +0000 UTC m=+162.695448119" observedRunningTime="2026-04-17 11:33:09.569386141 +0000 UTC m=+163.097963392" watchObservedRunningTime="2026-04-17 11:33:09.569477476 +0000 UTC m=+163.098054726" Apr 17 11:33:10.265043 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.264763 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" podUID="8d6a3bcb-48e8-419c-bf32-5d7d48940e2c" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused" Apr 17 11:33:10.345111 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.345076 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-59ccbf6d54-57ct9"] Apr 17 11:33:10.348334 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.348256 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.353650 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.353626 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bjh8gsim2tslg\"" Apr 17 11:33:10.354080 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.354058 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 11:33:10.354297 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.354278 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 11:33:10.354297 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.354292 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-m84xs\"" Apr 17 11:33:10.354605 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.354589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 11:33:10.355358 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.355309 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 11:33:10.357301 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.357251 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 11:33:10.368331 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.368297 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-59ccbf6d54-57ct9"] Apr 17 11:33:10.382922 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.382892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383087 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.382946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a19e39a-6f59-44dd-ada4-c336a0970663-metrics-client-ca\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383087 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.383011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-tls\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383087 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.383057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tntg\" (UniqueName: 
\"kubernetes.io/projected/6a19e39a-6f59-44dd-ada4-c336a0970663-kube-api-access-8tntg\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383087 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.383082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-grpc-tls\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383322 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.383159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383322 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.383218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.383322 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.383241 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484173 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-tls\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484357 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tntg\" (UniqueName: \"kubernetes.io/projected/6a19e39a-6f59-44dd-ada4-c336a0970663-kube-api-access-8tntg\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484357 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-grpc-tls\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484357 ip-10-0-130-210 kubenswrapper[2570]: I0417 
11:33:10.484256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484357 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484329 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484532 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484532 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.484532 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.484453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a19e39a-6f59-44dd-ada4-c336a0970663-metrics-client-ca\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.485331 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.485279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a19e39a-6f59-44dd-ada4-c336a0970663-metrics-client-ca\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.487388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.487335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-grpc-tls\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.487388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.487364 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " 
pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.487585 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.487392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.488409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.488366 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.488558 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.488533 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-tls\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.489114 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.489088 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6a19e39a-6f59-44dd-ada4-c336a0970663-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.493930 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.493908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tntg\" (UniqueName: \"kubernetes.io/projected/6a19e39a-6f59-44dd-ada4-c336a0970663-kube-api-access-8tntg\") pod \"thanos-querier-59ccbf6d54-57ct9\" (UID: \"6a19e39a-6f59-44dd-ada4-c336a0970663\") " pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.547012 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.545987 2570 generic.go:358] "Generic (PLEG): container finished" podID="8d6a3bcb-48e8-419c-bf32-5d7d48940e2c" containerID="3d6390627b2ae39db758d31a98a306773fad90f298e4be440da598aa1e1a4030" exitCode=1 Apr 17 11:33:10.547012 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.546065 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" event={"ID":"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c","Type":"ContainerDied","Data":"3d6390627b2ae39db758d31a98a306773fad90f298e4be440da598aa1e1a4030"} Apr 17 11:33:10.547012 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.546432 2570 scope.go:117] "RemoveContainer" containerID="3d6390627b2ae39db758d31a98a306773fad90f298e4be440da598aa1e1a4030" Apr 17 11:33:10.549293 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.549260 2570 generic.go:358] "Generic (PLEG): container finished" podID="34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8" containerID="5b4f53334cca05f5ac55ea257964341868945d038e005751948abbda504d315d" exitCode=0 Apr 17 11:33:10.549433 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.549296 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjsqn" event={"ID":"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8","Type":"ContainerDied","Data":"5b4f53334cca05f5ac55ea257964341868945d038e005751948abbda504d315d"} Apr 17 11:33:10.660060 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.659472 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:10.789769 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:10.789738 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-59ccbf6d54-57ct9"] Apr 17 11:33:10.793771 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:10.793741 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a19e39a_6f59_44dd_ada4_c336a0970663.slice/crio-495959b3f7fde1a1d94c1bfa7f0fd5f3ae38f60aa003a85a2d9ae053e5e7dc52 WatchSource:0}: Error finding container 495959b3f7fde1a1d94c1bfa7f0fd5f3ae38f60aa003a85a2d9ae053e5e7dc52: Status 404 returned error can't find the container with id 495959b3f7fde1a1d94c1bfa7f0fd5f3ae38f60aa003a85a2d9ae053e5e7dc52 Apr 17 11:33:11.554795 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.554756 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" event={"ID":"8d6a3bcb-48e8-419c-bf32-5d7d48940e2c","Type":"ContainerStarted","Data":"406706acd839e53f0ea1ffe4c04bad3f9d873ee33bf6840f592a276fed5951b6"} Apr 17 11:33:11.555244 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.555077 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" Apr 17 11:33:11.555891 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.555868 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bdcc4dcd4-gdjsb" Apr 17 11:33:11.557335 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.557278 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjsqn" event={"ID":"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8","Type":"ContainerStarted","Data":"3c50266fe830e341846e5c353b5680093d821b900669e52e8bd6e45967934507"} Apr 17 11:33:11.557335 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.557306 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjsqn" event={"ID":"34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8","Type":"ContainerStarted","Data":"38c412f591852a214cfe542396acd228427874972f7b5aacbf85d9cdafa4203e"} Apr 17 11:33:11.558605 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.558580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"495959b3f7fde1a1d94c1bfa7f0fd5f3ae38f60aa003a85a2d9ae053e5e7dc52"} Apr 17 11:33:11.559865 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.559838 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="99d1e7e28e031f192caa1130d21c729444e9b3422002d2812f0dd9a4745c59f2" exitCode=0 Apr 17 11:33:11.559958 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.559880 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"99d1e7e28e031f192caa1130d21c729444e9b3422002d2812f0dd9a4745c59f2"} Apr 17 11:33:11.605175 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:11.605114 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rjsqn" podStartSLOduration=3.831093772 podStartE2EDuration="4.605095443s" podCreationTimestamp="2026-04-17 11:33:07 +0000 UTC" firstStartedPulling="2026-04-17 11:33:09.170615729 +0000 UTC m=+162.699192964" lastFinishedPulling="2026-04-17 11:33:09.944617405 +0000 UTC m=+163.473194635" observedRunningTime="2026-04-17 11:33:11.604259484 +0000 UTC m=+165.132836748" watchObservedRunningTime="2026-04-17 11:33:11.605095443 +0000 UTC m=+165.133672693" Apr 17 11:33:12.565527 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:12.565467 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"927a0be998fcb3872961978ff548d147c4ed277439affe59c7abbe7057ebe427"} Apr 17 11:33:12.565527 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:12.565532 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"a00f4963c008189b4c726de2f5b6433fc7569bc0c30da6bebc1899713719bc98"} Apr 17 11:33:12.566027 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:12.565550 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"9890d74df63139678137e7fa7c31726382c46e0bec24d710b68bb62550277f9f"} Apr 17 11:33:13.129618 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.129591 2570 patch_prober.go:28] interesting pod/image-registry-8648f7995c-dng8z container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 11:33:13.129735 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.129643 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 11:33:13.468883 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.468841 2570 patch_prober.go:28] interesting pod/image-registry-756549b58b-rdkhs container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 11:33:13.469046 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.468907 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-756549b58b-rdkhs" podUID="58df3935-ab94-49e4-9e5e-a716b5374775" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 11:33:13.572556 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.572495 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"745bf27df552896d2c78943ecefcc530ef69e45e7d2f08cb0b9e345e83ecffe4"} Apr 17 11:33:13.572556 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.572559 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"38682e33f208a85c9a688f92a1c16b001628df59f968720322515d417bb94db6"} Apr 17 11:33:13.573046 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.572575 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"a77fa8737aa9f96d49595f691a7344c79a6cd41f3621b3d25def8236800485b9"} Apr 17 11:33:13.573046 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.572588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"64f4c5ab3be57636c6e3e804d355a057b8ad60244a23a0289cf5eb0d2d5bda46"} Apr 17 11:33:13.573046 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:13.572601 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"ae399d5c0c103202a519bb999d8b5cc1edeaa5e799bc684f13f16f817c87c514"} Apr 17 11:33:14.577953 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.577909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"644aaf1662e7dfbb2c03354a2d5b8afed99c8a1cfc272d6ae0805eec90c9e135"} Apr 17 11:33:14.577953 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.577952 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"6c4bf3b44e32347d45a8fae9ededcbf96e8246acca9bdd7558f0b05275d55086"} Apr 17 11:33:14.578440 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.577966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" event={"ID":"6a19e39a-6f59-44dd-ada4-c336a0970663","Type":"ContainerStarted","Data":"b3c64a2b09def92b2b5f7558872582bf803f5e2acf42250da5858fb50724d1f2"} Apr 17 11:33:14.578440 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.578068 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" Apr 17 11:33:14.580610 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.580584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerStarted","Data":"f0224d1b309f20cbab064a9b1c96ade7925527da9f6be43917fc036c0e22f9db"} Apr 17 11:33:14.600981 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.600924 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9" podStartSLOduration=1.775581503 podStartE2EDuration="4.600909047s" podCreationTimestamp="2026-04-17 11:33:10 +0000 UTC" firstStartedPulling="2026-04-17 11:33:10.7960798 +0000 UTC m=+164.324657026" 
lastFinishedPulling="2026-04-17 11:33:13.62140734 +0000 UTC m=+167.149984570" observedRunningTime="2026-04-17 11:33:14.599286198 +0000 UTC m=+168.127863446" watchObservedRunningTime="2026-04-17 11:33:14.600909047 +0000 UTC m=+168.129486296" Apr 17 11:33:14.631868 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:14.631805 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.359086921 podStartE2EDuration="6.631791359s" podCreationTimestamp="2026-04-17 11:33:08 +0000 UTC" firstStartedPulling="2026-04-17 11:33:09.477403828 +0000 UTC m=+163.005981055" lastFinishedPulling="2026-04-17 11:33:13.750108261 +0000 UTC m=+167.278685493" observedRunningTime="2026-04-17 11:33:14.631357517 +0000 UTC m=+168.159934766" watchObservedRunningTime="2026-04-17 11:33:14.631791359 +0000 UTC m=+168.160368611" Apr 17 11:33:15.497974 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:15.497940 2570 patch_prober.go:28] interesting pod/image-registry-756549b58b-rdkhs container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 11:33:15.498140 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:15.497992 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-756549b58b-rdkhs" podUID="58df3935-ab94-49e4-9e5e-a716b5374775" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 11:33:17.089597 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:17.089563 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:33:18.143879 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.143841 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerName="registry" containerID="cri-o://89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0" gracePeriod=30 Apr 17 11:33:18.378836 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.378811 2570 util.go:48] "No ready sandbox for pod can be found. 
Apr 17 11:33:18.455543 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455439 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-certificates\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455543 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455480 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8643c8-eca1-4bb8-957a-ed244556d475-ca-trust-extracted\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455556 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455579 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-trusted-ca\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455625 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-bound-sa-token\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455659 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9ht\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-kube-api-access-6r9ht\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455687 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-installation-pull-secrets\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.455744 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455718 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-image-registry-private-configuration\") pod \"6b8643c8-eca1-4bb8-957a-ed244556d475\" (UID: \"6b8643c8-eca1-4bb8-957a-ed244556d475\") "
Apr 17 11:33:18.456035 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.455951 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:33:18.456376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.456337 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:33:18.458225 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.458197 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-kube-api-access-6r9ht" (OuterVolumeSpecName: "kube-api-access-6r9ht") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "kube-api-access-6r9ht". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:33:18.458328 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.458212 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:33:18.458328 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.458233 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:33:18.458328 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.458300 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:33:18.458854 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.458833 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:33:18.464628 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.464499 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8643c8-eca1-4bb8-957a-ed244556d475-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6b8643c8-eca1-4bb8-957a-ed244556d475" (UID: "6b8643c8-eca1-4bb8-957a-ed244556d475"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:33:18.556347 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556310 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-bound-sa-token\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556347 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556341 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6r9ht\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-kube-api-access-6r9ht\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556347 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556351 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-installation-pull-secrets\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556596 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556360 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b8643c8-eca1-4bb8-957a-ed244556d475-image-registry-private-configuration\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556596 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556371 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-certificates\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556596 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556380 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b8643c8-eca1-4bb8-957a-ed244556d475-ca-trust-extracted\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556596 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556389 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b8643c8-eca1-4bb8-957a-ed244556d475-registry-tls\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.556596 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.556397 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b8643c8-eca1-4bb8-957a-ed244556d475-trusted-ca\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:33:18.598656 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.598622 2570 generic.go:358] "Generic (PLEG): container finished" podID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerID="89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0" exitCode=0 Apr 17 11:33:18.598777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.598681 2570 util.go:48] "No ready sandbox for pod can be found. 
Apr 17 11:33:18.598777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.598697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" event={"ID":"6b8643c8-eca1-4bb8-957a-ed244556d475","Type":"ContainerDied","Data":"89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0"}
Apr 17 11:33:18.598777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.598736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8648f7995c-dng8z" event={"ID":"6b8643c8-eca1-4bb8-957a-ed244556d475","Type":"ContainerDied","Data":"20515735f9d921a13c47176e0e7056679443cb162edae3019b285442540f7f39"}
Apr 17 11:33:18.598777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.598754 2570 scope.go:117] "RemoveContainer" containerID="89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0"
Apr 17 11:33:18.607192 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.607170 2570 scope.go:117] "RemoveContainer" containerID="89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0"
Apr 17 11:33:18.607444 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:33:18.607423 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0\": container with ID starting with 89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0 not found: ID does not exist" containerID="89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0"
Apr 17 11:33:18.607533 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.607456 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0"} err="failed to get container status \"89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0\": rpc error: code = NotFound desc = could not find container \"89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0\": container with ID starting with 89683853c43cf5a2617a03f21f26d0aa30c33a5ba1dae9f6270a948d76e753c0 not found: ID does not exist"
Apr 17 11:33:18.616992 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.616965 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8648f7995c-dng8z"]
Apr 17 11:33:18.619082 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:18.619052 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8648f7995c-dng8z"]
Apr 17 11:33:19.092455 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:19.092419 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" path="/var/lib/kubelet/pods/6b8643c8-eca1-4bb8-957a-ed244556d475/volumes"
Apr 17 11:33:19.551617 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:19.551539 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cxbf8"
Apr 17 11:33:20.595233 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:20.595206 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-59ccbf6d54-57ct9"
Apr 17 11:33:21.286876 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.286845 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-k5dth"]
Apr 17 11:33:21.287213 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.287175 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerName="registry"
Apr 17 11:33:21.287291 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.287218 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerName="registry"
Apr 17 11:33:21.287322 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.287308 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b8643c8-eca1-4bb8-957a-ed244556d475" containerName="registry"
Apr 17 11:33:21.290594 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.290572 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:21.292705 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.292682 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 11:33:21.292839 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.292712 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 11:33:21.292839 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.292723 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-zv746\""
Apr 17 11:33:21.299941 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.299919 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-k5dth"]
Apr 17 11:33:21.386600 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.386567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flnw5\" (UniqueName: \"kubernetes.io/projected/41962a97-4ad6-4005-b2ed-b2a0a463e7e0-kube-api-access-flnw5\") pod \"downloads-6bcc868b7-k5dth\" (UID: \"41962a97-4ad6-4005-b2ed-b2a0a463e7e0\") " pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:21.487891 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.487856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flnw5\" (UniqueName: \"kubernetes.io/projected/41962a97-4ad6-4005-b2ed-b2a0a463e7e0-kube-api-access-flnw5\") pod \"downloads-6bcc868b7-k5dth\" (UID: \"41962a97-4ad6-4005-b2ed-b2a0a463e7e0\") " pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:21.495774 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.495740 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flnw5\" (UniqueName: \"kubernetes.io/projected/41962a97-4ad6-4005-b2ed-b2a0a463e7e0-kube-api-access-flnw5\") pod \"downloads-6bcc868b7-k5dth\" (UID: \"41962a97-4ad6-4005-b2ed-b2a0a463e7e0\") " pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:21.599745 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.599701 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:21.721032 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:21.720992 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-k5dth"]
Apr 17 11:33:21.724949 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:21.724919 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41962a97_4ad6_4005_b2ed_b2a0a463e7e0.slice/crio-008f40e44796d0b5dca4db8b7a42a3dadc8345ccba94d66d9b00b17954c0813b WatchSource:0}: Error finding container 008f40e44796d0b5dca4db8b7a42a3dadc8345ccba94d66d9b00b17954c0813b: Status 404 returned error can't find the container with id 008f40e44796d0b5dca4db8b7a42a3dadc8345ccba94d66d9b00b17954c0813b
Apr 17 11:33:22.612386 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:22.612351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-k5dth" event={"ID":"41962a97-4ad6-4005-b2ed-b2a0a463e7e0","Type":"ContainerStarted","Data":"008f40e44796d0b5dca4db8b7a42a3dadc8345ccba94d66d9b00b17954c0813b"}
Apr 17 11:33:23.468974 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:23.468941 2570 patch_prober.go:28] interesting pod/image-registry-756549b58b-rdkhs container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 11:33:23.469149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:23.469002 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-756549b58b-rdkhs" podUID="58df3935-ab94-49e4-9e5e-a716b5374775" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 11:33:25.498376 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:25.498346 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-756549b58b-rdkhs"
Apr 17 11:33:30.805032 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.804997 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b8d456b7d-qdr76"]
Apr 17 11:33:30.811654 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.811630 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b8d456b7d-qdr76"
Apr 17 11:33:30.814163 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.814136 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 11:33:30.814278 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.814144 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 11:33:30.814998 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.814976 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 11:33:30.815154 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.815002 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7sz8w\""
Apr 17 11:33:30.815154 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.815083 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 11:33:30.815154 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.814986 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 11:33:30.818574 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.818547 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b8d456b7d-qdr76"]
Apr 17 11:33:30.979651 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.979616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-oauth-config\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76"
Apr 17 11:33:30.979813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.979742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89jkx\" (UniqueName: \"kubernetes.io/projected/65b02b28-c15c-430b-8af4-f9b1bc590f21-kube-api-access-89jkx\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76"
Apr 17 11:33:30.979813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.979790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-config\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76"
Apr 17 11:33:30.979917 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.979865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-service-ca\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76"
Apr 17 11:33:30.979975 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.979944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-serving-cert\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76"
\"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:30.980045 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:30.979997 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-oauth-serving-cert\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.080900 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.080865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-oauth-config\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081066 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.080957 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89jkx\" (UniqueName: \"kubernetes.io/projected/65b02b28-c15c-430b-8af4-f9b1bc590f21-kube-api-access-89jkx\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081066 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.080986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-config\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081066 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.081055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-service-ca\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081222 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.081082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-serving-cert\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081222 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.081105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-oauth-serving-cert\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081896 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.081831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-service-ca\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.081896 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.081829 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-config\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.082083 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.081906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-oauth-serving-cert\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.083938 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.083914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-oauth-config\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.084076 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.083957 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-serving-cert\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.089559 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.089504 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89jkx\" (UniqueName: \"kubernetes.io/projected/65b02b28-c15c-430b-8af4-f9b1bc590f21-kube-api-access-89jkx\") pod \"console-5b8d456b7d-qdr76\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:31.122195 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.122159 2570 util.go:30] "No sandbox for pod can be found. 
Apr 17 11:33:31.277328 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.277298 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b8d456b7d-qdr76"]
Apr 17 11:33:31.280167 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:31.280138 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b02b28_c15c_430b_8af4_f9b1bc590f21.slice/crio-cd89ea404d2724d3141a490c3d4cfb743b7cb29e70da04d006293bba7af8c7b7 WatchSource:0}: Error finding container cd89ea404d2724d3141a490c3d4cfb743b7cb29e70da04d006293bba7af8c7b7: Status 404 returned error can't find the container with id cd89ea404d2724d3141a490c3d4cfb743b7cb29e70da04d006293bba7af8c7b7
Apr 17 11:33:31.640674 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:31.640639 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b8d456b7d-qdr76" event={"ID":"65b02b28-c15c-430b-8af4-f9b1bc590f21","Type":"ContainerStarted","Data":"cd89ea404d2724d3141a490c3d4cfb743b7cb29e70da04d006293bba7af8c7b7"}
Apr 17 11:33:38.664695 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:38.664653 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-k5dth" event={"ID":"41962a97-4ad6-4005-b2ed-b2a0a463e7e0","Type":"ContainerStarted","Data":"fed94baf74b0251fe4c07fbdfbc1a3a14b033b08811866746b1b526723e67e35"}
Apr 17 11:33:38.665176 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:38.665014 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:38.678887 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:38.678853 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-k5dth"
Apr 17 11:33:38.683319 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:38.683266 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-k5dth" podStartSLOduration=1.347895816 podStartE2EDuration="17.683246976s" podCreationTimestamp="2026-04-17 11:33:21 +0000 UTC" firstStartedPulling="2026-04-17 11:33:21.726545743 +0000 UTC m=+175.255122969" lastFinishedPulling="2026-04-17 11:33:38.061896884 +0000 UTC m=+191.590474129" observedRunningTime="2026-04-17 11:33:38.682892962 +0000 UTC m=+192.211470210" watchObservedRunningTime="2026-04-17 11:33:38.683246976 +0000 UTC m=+192.211824226"
Apr 17 11:33:39.585853 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.585787 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77b4fdfbf6-25lvm"]
Apr 17 11:33:39.589647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.589611 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.598360 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.598150 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 11:33:39.599718 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.599684 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b4fdfbf6-25lvm"]
Apr 17 11:33:39.661064 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.660968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-service-ca\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.661064 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.661038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-trusted-ca-bundle\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.661478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.661081 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-serving-cert\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.661478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.661155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4snx\" (UniqueName: \"kubernetes.io/projected/edda8085-a282-45d9-98c9-ca1700d5801d-kube-api-access-g4snx\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.661478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.661188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-oauth-config\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.661478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.661218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-oauth-serving-cert\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.661478 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.661244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-console-config\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.762777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762698 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-trusted-ca-bundle\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.762777 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-serving-cert\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.763377 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4snx\" (UniqueName: \"kubernetes.io/projected/edda8085-a282-45d9-98c9-ca1700d5801d-kube-api-access-g4snx\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.763377 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-oauth-config\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.763377 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-oauth-serving-cert\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.763377 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762940 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-console-config\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.763377 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.762992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-service-ca\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.764384 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.764244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-service-ca\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
Apr 17 11:33:39.764384 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.764261 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-oauth-serving-cert\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm"
\"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:39.764709 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.764604 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-trusted-ca-bundle\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:39.764776 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.764729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-console-config\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:39.766839 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.766776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-oauth-config\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:39.766839 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.766778 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-serving-cert\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:39.773121 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.773096 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4snx\" (UniqueName: \"kubernetes.io/projected/edda8085-a282-45d9-98c9-ca1700d5801d-kube-api-access-g4snx\") pod \"console-77b4fdfbf6-25lvm\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:39.905598 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:39.905497 2570 util.go:30] "No sandbox for pod can be found. 
Apr 17 11:33:41.035874 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:41.035570 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b4fdfbf6-25lvm"]
Apr 17 11:33:41.038843 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:33:41.038805 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedda8085_a282_45d9_98c9_ca1700d5801d.slice/crio-9df2ad5b814be520ff8a3d6aef5c9b54590444c45e36e3c4d4058001ec241337 WatchSource:0}: Error finding container 9df2ad5b814be520ff8a3d6aef5c9b54590444c45e36e3c4d4058001ec241337: Status 404 returned error can't find the container with id 9df2ad5b814be520ff8a3d6aef5c9b54590444c45e36e3c4d4058001ec241337
Apr 17 11:33:41.681897 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:41.681858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b4fdfbf6-25lvm" event={"ID":"edda8085-a282-45d9-98c9-ca1700d5801d","Type":"ContainerStarted","Data":"9df2ad5b814be520ff8a3d6aef5c9b54590444c45e36e3c4d4058001ec241337"}
Apr 17 11:33:41.683669 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:41.683637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b8d456b7d-qdr76" event={"ID":"65b02b28-c15c-430b-8af4-f9b1bc590f21","Type":"ContainerStarted","Data":"e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc"}
Apr 17 11:33:41.714433 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:41.714386 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b8d456b7d-qdr76" podStartSLOduration=1.6594151 podStartE2EDuration="11.714371608s" podCreationTimestamp="2026-04-17 11:33:30 +0000 UTC" firstStartedPulling="2026-04-17 11:33:31.282654225 +0000 UTC m=+184.811231456" lastFinishedPulling="2026-04-17 11:33:41.337610721 +0000 UTC m=+194.866187964" observedRunningTime="2026-04-17 11:33:41.712779827 +0000 UTC m=+195.241357083" watchObservedRunningTime="2026-04-17 11:33:41.714371608 +0000 UTC m=+195.242948857"
Apr 17 11:33:42.688628 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:42.688583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b4fdfbf6-25lvm" event={"ID":"edda8085-a282-45d9-98c9-ca1700d5801d","Type":"ContainerStarted","Data":"b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3"}
Apr 17 11:33:42.706577 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:42.706495 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77b4fdfbf6-25lvm" podStartSLOduration=3.142869456 podStartE2EDuration="3.706473439s" podCreationTimestamp="2026-04-17 11:33:39 +0000 UTC" firstStartedPulling="2026-04-17 11:33:41.041020499 +0000 UTC m=+194.569597732" lastFinishedPulling="2026-04-17 11:33:41.604624477 +0000 UTC m=+195.133201715" observedRunningTime="2026-04-17 11:33:42.705429023 +0000 UTC m=+196.234006272" watchObservedRunningTime="2026-04-17 11:33:42.706473439 +0000 UTC m=+196.235050689"
Apr 17 11:33:48.705697 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:48.705663 2570 generic.go:358] "Generic (PLEG): container finished" podID="74b00ba1-c80d-4505-a365-8d7a3c6c7f9a" containerID="639fbaa3bd3e2065c43414fc669248b323486e801452b48b6332f5278cc40721" exitCode=0
Apr 17 11:33:48.706162 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:48.705742 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" event={"ID":"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a","Type":"ContainerDied","Data":"639fbaa3bd3e2065c43414fc669248b323486e801452b48b6332f5278cc40721"}
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" event={"ID":"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a","Type":"ContainerDied","Data":"639fbaa3bd3e2065c43414fc669248b323486e801452b48b6332f5278cc40721"} Apr 17 11:33:48.706162 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:48.706085 2570 scope.go:117] "RemoveContainer" containerID="639fbaa3bd3e2065c43414fc669248b323486e801452b48b6332f5278cc40721" Apr 17 11:33:49.710652 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:49.710603 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wv6fb" event={"ID":"74b00ba1-c80d-4505-a365-8d7a3c6c7f9a","Type":"ContainerStarted","Data":"af2638bac9992ae0ced99ebba64f8907b00702d2d776446fcd7d4aa8265868f4"} Apr 17 11:33:49.906437 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:49.906398 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:49.906437 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:49.906438 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:49.911256 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:49.911230 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:50.717111 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:50.717083 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:33:50.765193 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:50.765150 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b8d456b7d-qdr76"] Apr 17 11:33:51.123241 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:51.123199 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:33:54.725123 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:54.725083 2570 generic.go:358] "Generic (PLEG): container finished" podID="acf5f208-16b3-41eb-b3eb-0b10391e7e74" containerID="3b2e00027d2bd2182dd8cb95864e83621412aecc859e01a9aa8f84269b27176e" exitCode=0 Apr 17 11:33:54.725599 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:54.725157 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" event={"ID":"acf5f208-16b3-41eb-b3eb-0b10391e7e74","Type":"ContainerDied","Data":"3b2e00027d2bd2182dd8cb95864e83621412aecc859e01a9aa8f84269b27176e"} Apr 17 11:33:54.725599 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:54.725480 2570 scope.go:117] "RemoveContainer" containerID="3b2e00027d2bd2182dd8cb95864e83621412aecc859e01a9aa8f84269b27176e" Apr 17 11:33:55.729925 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:33:55.729885 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n7ngq" event={"ID":"acf5f208-16b3-41eb-b3eb-0b10391e7e74","Type":"ContainerStarted","Data":"a8548a4ea7a1d913d52fc69f875784a9e4908a92b54655523da9bdecfd2bdea7"} Apr 17 11:34:15.784333 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:15.784276 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b8d456b7d-qdr76" podUID="65b02b28-c15c-430b-8af4-f9b1bc590f21" containerName="console" 
containerID="cri-o://e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc" gracePeriod=15 Apr 17 11:34:16.035461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.035398 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b8d456b7d-qdr76_65b02b28-c15c-430b-8af4-f9b1bc590f21/console/0.log" Apr 17 11:34:16.035461 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.035458 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:34:16.195388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195357 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-config\") pod \"65b02b28-c15c-430b-8af4-f9b1bc590f21\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " Apr 17 11:34:16.195388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195391 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-oauth-config\") pod \"65b02b28-c15c-430b-8af4-f9b1bc590f21\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " Apr 17 11:34:16.195647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195449 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-serving-cert\") pod \"65b02b28-c15c-430b-8af4-f9b1bc590f21\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " Apr 17 11:34:16.195647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195473 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89jkx\" (UniqueName: \"kubernetes.io/projected/65b02b28-c15c-430b-8af4-f9b1bc590f21-kube-api-access-89jkx\") pod \"65b02b28-c15c-430b-8af4-f9b1bc590f21\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " Apr 17 11:34:16.195647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195497 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-service-ca\") pod \"65b02b28-c15c-430b-8af4-f9b1bc590f21\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " Apr 17 11:34:16.195647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195552 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-oauth-serving-cert\") pod \"65b02b28-c15c-430b-8af4-f9b1bc590f21\" (UID: \"65b02b28-c15c-430b-8af4-f9b1bc590f21\") " Apr 17 11:34:16.195888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195859 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-config" (OuterVolumeSpecName: "console-config") pod "65b02b28-c15c-430b-8af4-f9b1bc590f21" (UID: "65b02b28-c15c-430b-8af4-f9b1bc590f21"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:16.195963 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195942 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-service-ca" (OuterVolumeSpecName: "service-ca") pod "65b02b28-c15c-430b-8af4-f9b1bc590f21" (UID: "65b02b28-c15c-430b-8af4-f9b1bc590f21"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:16.196013 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.195990 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "65b02b28-c15c-430b-8af4-f9b1bc590f21" (UID: "65b02b28-c15c-430b-8af4-f9b1bc590f21"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:16.197832 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.197797 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "65b02b28-c15c-430b-8af4-f9b1bc590f21" (UID: "65b02b28-c15c-430b-8af4-f9b1bc590f21"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:16.197945 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.197833 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b02b28-c15c-430b-8af4-f9b1bc590f21-kube-api-access-89jkx" (OuterVolumeSpecName: "kube-api-access-89jkx") pod "65b02b28-c15c-430b-8af4-f9b1bc590f21" (UID: "65b02b28-c15c-430b-8af4-f9b1bc590f21"). InnerVolumeSpecName "kube-api-access-89jkx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:34:16.197945 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.197848 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "65b02b28-c15c-430b-8af4-f9b1bc590f21" (UID: "65b02b28-c15c-430b-8af4-f9b1bc590f21"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:16.296989 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.296900 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:16.296989 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.296931 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-oauth-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:16.296989 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.296941 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65b02b28-c15c-430b-8af4-f9b1bc590f21-console-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:16.296989 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.296951 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89jkx\" (UniqueName: \"kubernetes.io/projected/65b02b28-c15c-430b-8af4-f9b1bc590f21-kube-api-access-89jkx\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:16.296989 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.296961 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-service-ca\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:16.296989 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.296969 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65b02b28-c15c-430b-8af4-f9b1bc590f21-oauth-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:16.794097 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.794069 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b8d456b7d-qdr76_65b02b28-c15c-430b-8af4-f9b1bc590f21/console/0.log" Apr 17 11:34:16.794579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.794109 2570 generic.go:358] "Generic (PLEG): container finished" podID="65b02b28-c15c-430b-8af4-f9b1bc590f21" containerID="e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc" exitCode=2 Apr 17 11:34:16.794579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.794165 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b8d456b7d-qdr76" event={"ID":"65b02b28-c15c-430b-8af4-f9b1bc590f21","Type":"ContainerDied","Data":"e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc"} Apr 17 11:34:16.794579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.794187 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b8d456b7d-qdr76" event={"ID":"65b02b28-c15c-430b-8af4-f9b1bc590f21","Type":"ContainerDied","Data":"cd89ea404d2724d3141a490c3d4cfb743b7cb29e70da04d006293bba7af8c7b7"} Apr 17 11:34:16.794579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.794193 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b8d456b7d-qdr76" Apr 17 11:34:16.794579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.794203 2570 scope.go:117] "RemoveContainer" containerID="e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc" Apr 17 11:34:16.802626 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.802608 2570 scope.go:117] "RemoveContainer" containerID="e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc" Apr 17 11:34:16.802896 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:34:16.802876 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc\": container with ID starting with e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc not found: ID does not exist" containerID="e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc" Apr 17 11:34:16.802945 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.802905 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc"} err="failed to get container status \"e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc\": rpc error: code = NotFound desc = could not find container \"e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc\": container with ID starting with e0670f98f58a2a91a4a38344d36703d1b76453ca8477a824e29d7f1e432004dc not found: ID does not exist" Apr 17 11:34:16.816386 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.816357 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b8d456b7d-qdr76"] Apr 17 11:34:16.818999 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:16.818976 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b8d456b7d-qdr76"] Apr 17 11:34:17.092187 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:17.092158 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b02b28-c15c-430b-8af4-f9b1bc590f21" path="/var/lib/kubelet/pods/65b02b28-c15c-430b-8af4-f9b1bc590f21/volumes" Apr 17 11:34:27.613243 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613207 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:34:27.613676 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613616 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="alertmanager" containerID="cri-o://ae399d5c0c103202a519bb999d8b5cc1edeaa5e799bc684f13f16f817c87c514" gracePeriod=120 Apr 17 11:34:27.613739 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613707 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-metric" containerID="cri-o://745bf27df552896d2c78943ecefcc530ef69e45e7d2f08cb0b9e345e83ecffe4" gracePeriod=120 Apr 17 11:34:27.613794 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613707 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="prom-label-proxy" containerID="cri-o://f0224d1b309f20cbab064a9b1c96ade7925527da9f6be43917fc036c0e22f9db" gracePeriod=120 
Apr 17 11:34:27.613794 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613747 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy" containerID="cri-o://38682e33f208a85c9a688f92a1c16b001628df59f968720322515d417bb94db6" gracePeriod=120 Apr 17 11:34:27.613794 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613744 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-web" containerID="cri-o://a77fa8737aa9f96d49595f691a7344c79a6cd41f3621b3d25def8236800485b9" gracePeriod=120 Apr 17 11:34:27.613794 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.613728 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="config-reloader" containerID="cri-o://64f4c5ab3be57636c6e3e804d355a057b8ad60244a23a0289cf5eb0d2d5bda46" gracePeriod=120 Apr 17 11:34:27.827757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827724 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="f0224d1b309f20cbab064a9b1c96ade7925527da9f6be43917fc036c0e22f9db" exitCode=0 Apr 17 11:34:27.827757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827752 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="38682e33f208a85c9a688f92a1c16b001628df59f968720322515d417bb94db6" exitCode=0 Apr 17 11:34:27.827757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827758 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="64f4c5ab3be57636c6e3e804d355a057b8ad60244a23a0289cf5eb0d2d5bda46" exitCode=0 Apr 17 11:34:27.827757 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827764 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="ae399d5c0c103202a519bb999d8b5cc1edeaa5e799bc684f13f16f817c87c514" exitCode=0 Apr 17 11:34:27.828015 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827801 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"f0224d1b309f20cbab064a9b1c96ade7925527da9f6be43917fc036c0e22f9db"} Apr 17 11:34:27.828015 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"38682e33f208a85c9a688f92a1c16b001628df59f968720322515d417bb94db6"} Apr 17 11:34:27.828015 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827846 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"64f4c5ab3be57636c6e3e804d355a057b8ad60244a23a0289cf5eb0d2d5bda46"} Apr 17 11:34:27.828015 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:27.827856 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"ae399d5c0c103202a519bb999d8b5cc1edeaa5e799bc684f13f16f817c87c514"} Apr 17 11:34:28.834300 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.834272 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="745bf27df552896d2c78943ecefcc530ef69e45e7d2f08cb0b9e345e83ecffe4" exitCode=0 Apr 17 11:34:28.834300 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.834297 2570 generic.go:358] "Generic (PLEG): container finished" podID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerID="a77fa8737aa9f96d49595f691a7344c79a6cd41f3621b3d25def8236800485b9" exitCode=0 Apr 17 11:34:28.834652 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.834347 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"745bf27df552896d2c78943ecefcc530ef69e45e7d2f08cb0b9e345e83ecffe4"} Apr 17 11:34:28.834652 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.834386 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"a77fa8737aa9f96d49595f691a7344c79a6cd41f3621b3d25def8236800485b9"} Apr 17 11:34:28.849038 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.849017 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:28.902882 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.902850 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903065 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.902890 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-cluster-tls-config\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903065 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.902916 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-metrics-client-ca\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903065 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.902943 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdqlk\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-kube-api-access-kdqlk\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903065 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.902969 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-out\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903065 ip-10-0-130-210 
kubenswrapper[2570]: I0417 11:34:28.902996 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-main-db\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903065 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903022 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-web-config\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903081 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903121 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-web\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903169 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903195 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-volume\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903228 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-tls-assets\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903278 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-main-tls\") pod \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\" (UID: \"8098d7c4-6d94-4d0e-95ae-67ca813d7557\") " Apr 17 11:34:28.903346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903307 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:28.903718 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903559 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-metrics-client-ca\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:28.903718 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.903706 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:34:28.906019 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.905968 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:28.906313 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.906280 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-out" (OuterVolumeSpecName: "config-out") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:34:28.906642 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.906555 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:28.906642 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.906583 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:28.906642 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.906606 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:34:28.906642 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.906604 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:28.906642 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.906635 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-kube-api-access-kdqlk" (OuterVolumeSpecName: "kube-api-access-kdqlk") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "kube-api-access-kdqlk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:34:28.907276 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.907250 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:28.908298 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.908268 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-volume" (OuterVolumeSpecName: "config-volume") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:28.912623 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.912587 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:28.917661 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:28.917637 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-web-config" (OuterVolumeSpecName: "web-config") pod "8098d7c4-6d94-4d0e-95ae-67ca813d7557" (UID: "8098d7c4-6d94-4d0e-95ae-67ca813d7557"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004598 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-main-tls\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004631 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004643 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-cluster-tls-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004652 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdqlk\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-kube-api-access-kdqlk\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004661 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-out\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004670 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-main-db\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004679 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-web-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004688 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8098d7c4-6d94-4d0e-95ae-67ca813d7557-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.004691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004698 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.005068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004708 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.005068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004716 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/8098d7c4-6d94-4d0e-95ae-67ca813d7557-config-volume\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.005068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.004726 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8098d7c4-6d94-4d0e-95ae-67ca813d7557-tls-assets\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:34:29.839585 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.839539 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8098d7c4-6d94-4d0e-95ae-67ca813d7557","Type":"ContainerDied","Data":"9731d39f6d8468cf83d076c490cf88364bdc815cf8dbf149b04ee527ae07d062"} Apr 17 11:34:29.839585 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.839593 2570 scope.go:117] "RemoveContainer" containerID="f0224d1b309f20cbab064a9b1c96ade7925527da9f6be43917fc036c0e22f9db" Apr 17 11:34:29.840103 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.839620 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:29.846888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.846867 2570 scope.go:117] "RemoveContainer" containerID="745bf27df552896d2c78943ecefcc530ef69e45e7d2f08cb0b9e345e83ecffe4" Apr 17 11:34:29.855509 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.855489 2570 scope.go:117] "RemoveContainer" containerID="38682e33f208a85c9a688f92a1c16b001628df59f968720322515d417bb94db6" Apr 17 11:34:29.860466 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.860438 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:34:29.862722 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.862705 2570 scope.go:117] "RemoveContainer" containerID="a77fa8737aa9f96d49595f691a7344c79a6cd41f3621b3d25def8236800485b9" Apr 17 11:34:29.865085 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.865059 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:34:29.870004 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.869984 2570 scope.go:117] "RemoveContainer" containerID="64f4c5ab3be57636c6e3e804d355a057b8ad60244a23a0289cf5eb0d2d5bda46" Apr 17 11:34:29.877092 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.877072 2570 scope.go:117] "RemoveContainer" containerID="ae399d5c0c103202a519bb999d8b5cc1edeaa5e799bc684f13f16f817c87c514" Apr 17 11:34:29.884242 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.884219 2570 scope.go:117] "RemoveContainer" containerID="99d1e7e28e031f192caa1130d21c729444e9b3422002d2812f0dd9a4745c59f2" Apr 17 11:34:29.891291 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891268 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:34:29.891708 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891690 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="prom-label-proxy" Apr 17 11:34:29.891708 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891710 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="prom-label-proxy" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891724 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-web" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891733 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-web" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891747 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="alertmanager" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891756 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="alertmanager" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891773 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-metric" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891782 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-metric" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891799 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="config-reloader" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891808 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="config-reloader" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891819 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="init-config-reloader" Apr 17 11:34:29.891827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891826 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="init-config-reloader" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891840 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891849 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891859 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b02b28-c15c-430b-8af4-f9b1bc590f21" containerName="console" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891868 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b02b28-c15c-430b-8af4-f9b1bc590f21" containerName="console" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891929 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-metric" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891942 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891953 2570 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="kube-rbac-proxy-web" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891965 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="config-reloader" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891976 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="alertmanager" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891985 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" containerName="prom-label-proxy" Apr 17 11:34:29.892124 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.891993 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b02b28-c15c-430b-8af4-f9b1bc590f21" containerName="console" Apr 17 11:34:29.897191 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.897174 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:29.899818 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.899793 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 11:34:29.899915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.899830 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 11:34:29.899915 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.899793 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 11:34:29.900046 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.899965 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 11:34:29.900181 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.900122 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 11:34:29.900181 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.900123 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 11:34:29.900335 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.900181 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 11:34:29.900685 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.900667 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 11:34:29.900837 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.900822 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8jts2\"" Apr 17 11:34:29.905301 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.905247 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 11:34:29.907362 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:29.907342 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:34:30.011579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ededfcf7-3473-4054-94eb-c59a2a48cdf0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011507 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011544 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-web-config\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011579 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbx9\" (UniqueName: \"kubernetes.io/projected/ededfcf7-3473-4054-94eb-c59a2a48cdf0-kube-api-access-cwbx9\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ededfcf7-3473-4054-94eb-c59a2a48cdf0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011670 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ededfcf7-3473-4054-94eb-c59a2a48cdf0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011717 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-config-volume\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011805 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011830 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ededfcf7-3473-4054-94eb-c59a2a48cdf0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011863 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.011924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.011891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ededfcf7-3473-4054-94eb-c59a2a48cdf0-config-out\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.112633 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-config-volume\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.112633 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.112873 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112659 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.112873 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112678 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ededfcf7-3473-4054-94eb-c59a2a48cdf0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.112873 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113014 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ededfcf7-3473-4054-94eb-c59a2a48cdf0-config-out\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113014 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ededfcf7-3473-4054-94eb-c59a2a48cdf0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113014 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.112995 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113204 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.113022 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-web-config\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113204 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.113055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113204 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.113082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwbx9\" (UniqueName: \"kubernetes.io/projected/ededfcf7-3473-4054-94eb-c59a2a48cdf0-kube-api-access-cwbx9\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113204 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.113109 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ededfcf7-3473-4054-94eb-c59a2a48cdf0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113204 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.113146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ededfcf7-3473-4054-94eb-c59a2a48cdf0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.113874 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.113832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ededfcf7-3473-4054-94eb-c59a2a48cdf0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.114051 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.114027 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ededfcf7-3473-4054-94eb-c59a2a48cdf0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116054 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.115929 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116054 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.115934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ededfcf7-3473-4054-94eb-c59a2a48cdf0-config-out\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116054 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.115976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116054 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.116047 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116300 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.116264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ededfcf7-3473-4054-94eb-c59a2a48cdf0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116590 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.116555 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-web-config\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.116709 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.116687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-config-volume\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.117038 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.117019 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.118147 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.118131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ededfcf7-3473-4054-94eb-c59a2a48cdf0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.118418 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.118403 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ededfcf7-3473-4054-94eb-c59a2a48cdf0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.125068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.125042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwbx9\" (UniqueName: \"kubernetes.io/projected/ededfcf7-3473-4054-94eb-c59a2a48cdf0-kube-api-access-cwbx9\") pod \"alertmanager-main-0\" (UID: \"ededfcf7-3473-4054-94eb-c59a2a48cdf0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.207927 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.207889 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:34:30.336257 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.336218 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:34:30.340019 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:34:30.339992 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podededfcf7_3473_4054_94eb_c59a2a48cdf0.slice/crio-c082d607d5a3fcde15a0bf2deda6dcb1055d22d6c14bfc4906f3dbaa4c571086 WatchSource:0}: Error finding container c082d607d5a3fcde15a0bf2deda6dcb1055d22d6c14bfc4906f3dbaa4c571086: Status 404 returned error can't find the container with id c082d607d5a3fcde15a0bf2deda6dcb1055d22d6c14bfc4906f3dbaa4c571086 Apr 17 11:34:30.844839 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.844807 2570 generic.go:358] "Generic (PLEG): container finished" podID="ededfcf7-3473-4054-94eb-c59a2a48cdf0" containerID="ffda3f2a0a91aa1ba9064ebedb5db873fa978a1358321d858dc88b2ee780234b" exitCode=0 Apr 17 11:34:30.845243 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.844887 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerDied","Data":"ffda3f2a0a91aa1ba9064ebedb5db873fa978a1358321d858dc88b2ee780234b"} Apr 17 11:34:30.845243 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:30.844908 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"c082d607d5a3fcde15a0bf2deda6dcb1055d22d6c14bfc4906f3dbaa4c571086"} Apr 17 11:34:31.093595 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.093563 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8098d7c4-6d94-4d0e-95ae-67ca813d7557" path="/var/lib/kubelet/pods/8098d7c4-6d94-4d0e-95ae-67ca813d7557/volumes" Apr 17 11:34:31.625859 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.625816 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l"] Apr 17 11:34:31.629280 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.629256 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.631672 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.631642 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j94gv\"" Apr 17 11:34:31.631821 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.631673 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 11:34:31.631821 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.631715 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 11:34:31.631821 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.631727 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 11:34:31.631821 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.631673 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 11:34:31.632170 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.632149 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 11:34:31.637532 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.637491 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 11:34:31.640319 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.640288 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l"] Apr 17 11:34:31.729006 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.728969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-serving-certs-ca-bundle\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729170 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-secret-telemeter-client\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729170 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729052 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729170 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-federate-client-tls\") pod 
\"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729309 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729198 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pld9\" (UniqueName: \"kubernetes.io/projected/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-kube-api-access-7pld9\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729309 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729255 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729309 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-telemeter-client-tls\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.729409 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.729310 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-metrics-client-ca\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830288 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-serving-certs-ca-bundle\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830288 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-secret-telemeter-client\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830498 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830307 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830498 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" 
(UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-federate-client-tls\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830605 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830580 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pld9\" (UniqueName: \"kubernetes.io/projected/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-kube-api-access-7pld9\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830653 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830632 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830798 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-telemeter-client-tls\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.830927 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.830822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-metrics-client-ca\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.831112 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.831087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-serving-certs-ca-bundle\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.831306 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.831286 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.831585 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.831562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-metrics-client-ca\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.833261 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.833235 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.833373 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.833274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-telemeter-client-tls\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.833373 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.833309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-secret-telemeter-client\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.833373 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.833312 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-federate-client-tls\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.842080 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.842054 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pld9\" (UniqueName: \"kubernetes.io/projected/1c40b491-1f4d-4dc0-8428-53bc257ebd9d-kube-api-access-7pld9\") pod \"telemeter-client-f4b8f8d4c-pqk8l\" (UID: \"1c40b491-1f4d-4dc0-8428-53bc257ebd9d\") " pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:31.851339 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.851309 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"4b2b5a4e3c44c7e358eb63b8610aaccb3fcf63bd7147835e815f18f8b6e7075d"} Apr 17 11:34:31.851691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.851345 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"f72937bd0bf84a4b93c751c7f112362a047722e13959b353490be63937b974d5"} Apr 17 11:34:31.851691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.851359 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"7444e9966fa781fe327af56161c5749b466783334e25dae400fee22f5708000f"} Apr 17 11:34:31.851691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.851368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"adc22e1e9fd1ae045764b07534658f7bd10ee7e20ae2e2dc4f5b59db3cacfe3b"} Apr 17 11:34:31.851691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.851377 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"a265da500cb5b75a2a828586c5d7add885cfc887a71b110789e916055d35b979"} Apr 17 11:34:31.851691 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.851384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ededfcf7-3473-4054-94eb-c59a2a48cdf0","Type":"ContainerStarted","Data":"e26a1effe009558c44f38de4497fb53ed01a53b87fcedb8b9d7bfdfc71bb6245"} Apr 17 11:34:31.882840 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.882738 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.882723887 podStartE2EDuration="2.882723887s" podCreationTimestamp="2026-04-17 11:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:34:31.880927985 +0000 UTC m=+245.409505233" watchObservedRunningTime="2026-04-17 11:34:31.882723887 +0000 UTC m=+245.411301135" Apr 17 11:34:31.940823 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:31.940786 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" Apr 17 11:34:32.068021 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:32.067984 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l"] Apr 17 11:34:32.072433 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:34:32.072389 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c40b491_1f4d_4dc0_8428_53bc257ebd9d.slice/crio-1a84863e165d762fc58bdeb01ccdef6b5a077a1dfdbacc9cf18812f5ee18d7f0 WatchSource:0}: Error finding container 1a84863e165d762fc58bdeb01ccdef6b5a077a1dfdbacc9cf18812f5ee18d7f0: Status 404 returned error can't find the container with id 1a84863e165d762fc58bdeb01ccdef6b5a077a1dfdbacc9cf18812f5ee18d7f0 Apr 17 11:34:32.857357 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:32.857315 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" event={"ID":"1c40b491-1f4d-4dc0-8428-53bc257ebd9d","Type":"ContainerStarted","Data":"1a84863e165d762fc58bdeb01ccdef6b5a077a1dfdbacc9cf18812f5ee18d7f0"} Apr 17 11:34:34.865952 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:34.865917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" event={"ID":"1c40b491-1f4d-4dc0-8428-53bc257ebd9d","Type":"ContainerStarted","Data":"84a29f754d067ec6b8d92c6185e99ce3c31165fef8b72848ede9df324a2d3c5f"} Apr 17 11:34:34.865952 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:34.865954 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" event={"ID":"1c40b491-1f4d-4dc0-8428-53bc257ebd9d","Type":"ContainerStarted","Data":"e3f846804405f003149e3208a43c6b9718bb512e67de47f0537d9c0073fb6243"} Apr 17 11:34:34.866380 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:34.865966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f4b8f8d4c-pqk8l" event={"ID":"1c40b491-1f4d-4dc0-8428-53bc257ebd9d","Type":"ContainerStarted","Data":"7e1550a99a00fdd9836e61df173605845571eda59bc17dd46a29e4b701d02b14"} Apr 17 11:34:34.888657 ip-10-0-130-210 kubenswrapper[2570]: 
Apr 17 11:34:35.567051 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.567015 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f95cd9564-g9vfc"]
Apr 17 11:34:35.572033 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.571996 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f95cd9564-g9vfc"
Apr 17 11:34:35.581706 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.581674 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f95cd9564-g9vfc"]
Apr 17 11:34:35.668067 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-serving-cert\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc"
Apr 17 11:34:35.668266 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-service-ca\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc"
Apr 17 11:34:35.668266 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668167 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-oauth-config\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc"
Apr 17 11:34:35.668266 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668215 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-trusted-ca-bundle\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc"
Apr 17 11:34:35.668388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668305 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-oauth-serving-cert\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc"
Apr 17 11:34:35.668388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdf9x\" (UniqueName: 
\"kubernetes.io/projected/024f9aae-e4bf-4250-adc9-eab2d234ab19-kube-api-access-pdf9x\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.668388 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.668357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-config\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769071 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-serving-cert\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769071 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-service-ca\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769358 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-oauth-config\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769358 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-trusted-ca-bundle\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769358 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-oauth-serving-cert\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769358 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769190 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdf9x\" (UniqueName: \"kubernetes.io/projected/024f9aae-e4bf-4250-adc9-eab2d234ab19-kube-api-access-pdf9x\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.769358 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-config\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 
11:34:35.769955 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769931 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-config\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.770053 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.769975 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-oauth-serving-cert\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.770141 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.770120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-trusted-ca-bundle\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.770190 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.770160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-service-ca\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.771672 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.771651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-oauth-config\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.771770 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.771752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-serving-cert\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.776647 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.776626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdf9x\" (UniqueName: \"kubernetes.io/projected/024f9aae-e4bf-4250-adc9-eab2d234ab19-kube-api-access-pdf9x\") pod \"console-5f95cd9564-g9vfc\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:35.883416 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:35.883307 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:36.005441 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:36.005406 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f95cd9564-g9vfc"] Apr 17 11:34:36.009396 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:34:36.009361 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024f9aae_e4bf_4250_adc9_eab2d234ab19.slice/crio-c5be2f2c9780ce287e3a727109b9a0fe68b946f517f7f2a68f7255e4d5a3531c WatchSource:0}: Error finding container c5be2f2c9780ce287e3a727109b9a0fe68b946f517f7f2a68f7255e4d5a3531c: Status 404 returned error can't find the container with id c5be2f2c9780ce287e3a727109b9a0fe68b946f517f7f2a68f7255e4d5a3531c Apr 17 11:34:36.873070 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:36.873032 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f95cd9564-g9vfc" event={"ID":"024f9aae-e4bf-4250-adc9-eab2d234ab19","Type":"ContainerStarted","Data":"341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e"} Apr 17 11:34:36.873070 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:36.873068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f95cd9564-g9vfc" event={"ID":"024f9aae-e4bf-4250-adc9-eab2d234ab19","Type":"ContainerStarted","Data":"c5be2f2c9780ce287e3a727109b9a0fe68b946f517f7f2a68f7255e4d5a3531c"} Apr 17 11:34:36.907588 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:36.907505 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f95cd9564-g9vfc" podStartSLOduration=1.907490316 podStartE2EDuration="1.907490316s" podCreationTimestamp="2026-04-17 11:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:34:36.906500626 +0000 UTC m=+250.435077875" watchObservedRunningTime="2026-04-17 11:34:36.907490316 +0000 UTC m=+250.436067564" Apr 17 11:34:38.999814 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:38.999769 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:34:39.002105 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:39.002082 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343340da-6202-4b41-8b3d-4e0c0f72ecb6-metrics-certs\") pod \"network-metrics-daemon-z52nx\" (UID: \"343340da-6202-4b41-8b3d-4e0c0f72ecb6\") " pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:34:39.292683 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:39.292603 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lzg8w\"" Apr 17 11:34:39.300469 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:39.300444 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z52nx" Apr 17 11:34:39.428836 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:39.428802 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z52nx"] Apr 17 11:34:39.432707 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:34:39.432680 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343340da_6202_4b41_8b3d_4e0c0f72ecb6.slice/crio-3d8fc59a4285c28e85f9f1b471b81ad13f12dff245c70f4d25d09da8f8e2c96e WatchSource:0}: Error finding container 3d8fc59a4285c28e85f9f1b471b81ad13f12dff245c70f4d25d09da8f8e2c96e: Status 404 returned error can't find the container with id 3d8fc59a4285c28e85f9f1b471b81ad13f12dff245c70f4d25d09da8f8e2c96e Apr 17 11:34:39.883784 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:39.883749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z52nx" event={"ID":"343340da-6202-4b41-8b3d-4e0c0f72ecb6","Type":"ContainerStarted","Data":"3d8fc59a4285c28e85f9f1b471b81ad13f12dff245c70f4d25d09da8f8e2c96e"} Apr 17 11:34:40.890929 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:40.890892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z52nx" event={"ID":"343340da-6202-4b41-8b3d-4e0c0f72ecb6","Type":"ContainerStarted","Data":"daebed947d6d143d9e140968ee0672fa3d02cfb740ff000aab068d6a1caaa50a"} Apr 17 11:34:40.890929 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:40.890929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z52nx" event={"ID":"343340da-6202-4b41-8b3d-4e0c0f72ecb6","Type":"ContainerStarted","Data":"8c5e09b3aa4a578f49659d0bd65041a62e3404d30727bff54357e66b38f03281"} Apr 17 11:34:40.908021 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:40.907959 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z52nx" podStartSLOduration=253.015010885 podStartE2EDuration="4m13.907940767s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:34:39.434610129 +0000 UTC m=+252.963187356" lastFinishedPulling="2026-04-17 11:34:40.327540011 +0000 UTC m=+253.856117238" observedRunningTime="2026-04-17 11:34:40.907164495 +0000 UTC m=+254.435741747" watchObservedRunningTime="2026-04-17 11:34:40.907940767 +0000 UTC m=+254.436518018" Apr 17 11:34:45.883873 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:45.883781 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:45.883873 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:45.883829 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:45.888503 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:45.888477 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:45.910860 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:45.910833 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:34:45.959346 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:34:45.959311 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77b4fdfbf6-25lvm"] Apr 17 11:35:10.980061 ip-10-0-130-210 
kubenswrapper[2570]: I0417 11:35:10.980002 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77b4fdfbf6-25lvm" podUID="edda8085-a282-45d9-98c9-ca1700d5801d" containerName="console" containerID="cri-o://b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3" gracePeriod=15 Apr 17 11:35:11.215682 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.215662 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77b4fdfbf6-25lvm_edda8085-a282-45d9-98c9-ca1700d5801d/console/0.log" Apr 17 11:35:11.215797 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.215719 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:35:11.269680 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269587 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-oauth-serving-cert\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.269680 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269642 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4snx\" (UniqueName: \"kubernetes.io/projected/edda8085-a282-45d9-98c9-ca1700d5801d-kube-api-access-g4snx\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.269680 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269673 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-trusted-ca-bundle\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.269956 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269694 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-console-config\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.269956 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269714 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-service-ca\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.269956 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269751 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-oauth-config\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.269956 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.269781 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-serving-cert\") pod \"edda8085-a282-45d9-98c9-ca1700d5801d\" (UID: \"edda8085-a282-45d9-98c9-ca1700d5801d\") " Apr 17 11:35:11.270142 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.270075 2570 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:35:11.270181 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.270157 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-console-config" (OuterVolumeSpecName: "console-config") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:35:11.270240 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.270169 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-service-ca" (OuterVolumeSpecName: "service-ca") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:35:11.270240 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.270172 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:35:11.271962 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.271932 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:35:11.271962 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.271953 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:35:11.272108 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.271963 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edda8085-a282-45d9-98c9-ca1700d5801d-kube-api-access-g4snx" (OuterVolumeSpecName: "kube-api-access-g4snx") pod "edda8085-a282-45d9-98c9-ca1700d5801d" (UID: "edda8085-a282-45d9-98c9-ca1700d5801d"). InnerVolumeSpecName "kube-api-access-g4snx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:35:11.370568 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370492 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.370568 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370564 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-oauth-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.370568 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370574 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4snx\" (UniqueName: \"kubernetes.io/projected/edda8085-a282-45d9-98c9-ca1700d5801d-kube-api-access-g4snx\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.370568 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370583 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-trusted-ca-bundle\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.370840 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370592 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-console-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.370840 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370600 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/edda8085-a282-45d9-98c9-ca1700d5801d-service-ca\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.370840 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.370608 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/edda8085-a282-45d9-98c9-ca1700d5801d-console-oauth-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:35:11.987153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.987126 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77b4fdfbf6-25lvm_edda8085-a282-45d9-98c9-ca1700d5801d/console/0.log" Apr 17 11:35:11.987671 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.987175 2570 generic.go:358] "Generic (PLEG): container finished" podID="edda8085-a282-45d9-98c9-ca1700d5801d" containerID="b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3" exitCode=2 Apr 17 11:35:11.987671 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.987214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b4fdfbf6-25lvm" event={"ID":"edda8085-a282-45d9-98c9-ca1700d5801d","Type":"ContainerDied","Data":"b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3"} Apr 17 11:35:11.987671 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.987239 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b4fdfbf6-25lvm" event={"ID":"edda8085-a282-45d9-98c9-ca1700d5801d","Type":"ContainerDied","Data":"9df2ad5b814be520ff8a3d6aef5c9b54590444c45e36e3c4d4058001ec241337"} Apr 17 11:35:11.987671 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.987252 2570 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b4fdfbf6-25lvm" Apr 17 11:35:11.987671 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.987258 2570 scope.go:117] "RemoveContainer" containerID="b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3" Apr 17 11:35:11.995619 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.995594 2570 scope.go:117] "RemoveContainer" containerID="b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3" Apr 17 11:35:11.995880 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:35:11.995862 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3\": container with ID starting with b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3 not found: ID does not exist" containerID="b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3" Apr 17 11:35:11.995937 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:11.995888 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3"} err="failed to get container status \"b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3\": rpc error: code = NotFound desc = could not find container \"b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3\": container with ID starting with b7cf333b5309640f9689023d7e6187e708f99a89971ba61b0da9b5b60b81f5d3 not found: ID does not exist" Apr 17 11:35:12.008552 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:12.008495 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77b4fdfbf6-25lvm"] Apr 17 11:35:12.011342 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:12.011317 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77b4fdfbf6-25lvm"] Apr 17 11:35:13.091683 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:13.091644 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edda8085-a282-45d9-98c9-ca1700d5801d" path="/var/lib/kubelet/pods/edda8085-a282-45d9-98c9-ca1700d5801d/volumes" Apr 17 11:35:26.972292 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:26.972268 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:35:43.428565 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.428536 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f97688f88-6hblw"] Apr 17 11:35:43.429826 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.428835 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edda8085-a282-45d9-98c9-ca1700d5801d" containerName="console" Apr 17 11:35:43.429826 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.428845 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="edda8085-a282-45d9-98c9-ca1700d5801d" containerName="console" Apr 17 11:35:43.429826 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.428917 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="edda8085-a282-45d9-98c9-ca1700d5801d" containerName="console" Apr 17 11:35:43.430580 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.430565 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.442350 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.442322 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f97688f88-6hblw"] Apr 17 11:35:43.555998 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.555964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-oauth-config\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.555998 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.556003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-service-ca\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.556214 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.556026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-oauth-serving-cert\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.556214 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.556051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mhn\" (UniqueName: \"kubernetes.io/projected/4ecd67b0-d955-42bb-a311-f15d06d5915b-kube-api-access-c4mhn\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.556214 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.556086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-serving-cert\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.556214 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.556131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-config\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.556214 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.556151 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-trusted-ca-bundle\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657120 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657083 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-oauth-config\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657120 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-service-ca\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657329 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-oauth-serving-cert\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657329 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mhn\" (UniqueName: \"kubernetes.io/projected/4ecd67b0-d955-42bb-a311-f15d06d5915b-kube-api-access-c4mhn\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657329 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-serving-cert\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657445 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-config\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657563 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-trusted-ca-bundle\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.657954 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.657929 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-service-ca\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.658074 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.658048 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-config\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.658176 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.658132 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-oauth-serving-cert\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.658404 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.658380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-trusted-ca-bundle\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.659632 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.659610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-oauth-config\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.659794 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.659773 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-serving-cert\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.664153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.664134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mhn\" (UniqueName: \"kubernetes.io/projected/4ecd67b0-d955-42bb-a311-f15d06d5915b-kube-api-access-c4mhn\") pod \"console-f97688f88-6hblw\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.740223 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.740121 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:43.862185 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.862140 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f97688f88-6hblw"] Apr 17 11:35:43.864439 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:35:43.864399 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ecd67b0_d955_42bb_a311_f15d06d5915b.slice/crio-d59f187c65c6c82d5613d5b1f3377aec898ab1caa3c0b4d064eba7b38bfacd2c WatchSource:0}: Error finding container d59f187c65c6c82d5613d5b1f3377aec898ab1caa3c0b4d064eba7b38bfacd2c: Status 404 returned error can't find the container with id d59f187c65c6c82d5613d5b1f3377aec898ab1caa3c0b4d064eba7b38bfacd2c Apr 17 11:35:43.866115 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:43.866097 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:35:44.082985 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:44.082951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f97688f88-6hblw" event={"ID":"4ecd67b0-d955-42bb-a311-f15d06d5915b","Type":"ContainerStarted","Data":"a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46"} Apr 17 11:35:44.083148 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:44.082992 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f97688f88-6hblw" event={"ID":"4ecd67b0-d955-42bb-a311-f15d06d5915b","Type":"ContainerStarted","Data":"d59f187c65c6c82d5613d5b1f3377aec898ab1caa3c0b4d064eba7b38bfacd2c"} Apr 17 11:35:44.100372 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:44.100323 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f97688f88-6hblw" podStartSLOduration=1.1003081 podStartE2EDuration="1.1003081s" podCreationTimestamp="2026-04-17 11:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:35:44.098810523 +0000 UTC m=+317.627387771" watchObservedRunningTime="2026-04-17 11:35:44.1003081 +0000 UTC m=+317.628885348" Apr 17 11:35:53.740849 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:53.740794 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:53.740849 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:53.740845 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:53.745976 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:53.745950 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:54.115214 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:54.115187 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:35:54.162938 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:35:54.162905 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f95cd9564-g9vfc"] Apr 17 11:36:12.215078 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.215049 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qhzzb"] Apr 17 11:36:12.218367 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.218342 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.220603 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.220575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:36:12.225397 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.225374 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhzzb"] Apr 17 11:36:12.292963 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.292927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ec16b0-6a48-4435-994b-37438778f1f0-original-pull-secret\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.293166 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.292974 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/93ec16b0-6a48-4435-994b-37438778f1f0-dbus\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.293166 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.293063 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/93ec16b0-6a48-4435-994b-37438778f1f0-kubelet-config\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.394539 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.394478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/93ec16b0-6a48-4435-994b-37438778f1f0-kubelet-config\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.394719 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.394586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ec16b0-6a48-4435-994b-37438778f1f0-original-pull-secret\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.394719 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.394606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/93ec16b0-6a48-4435-994b-37438778f1f0-kubelet-config\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.394719 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.394641 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/93ec16b0-6a48-4435-994b-37438778f1f0-dbus\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.394872 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.394816 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/93ec16b0-6a48-4435-994b-37438778f1f0-dbus\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.396852 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.396822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ec16b0-6a48-4435-994b-37438778f1f0-original-pull-secret\") pod \"global-pull-secret-syncer-qhzzb\" (UID: \"93ec16b0-6a48-4435-994b-37438778f1f0\") " pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.529349 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.529231 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhzzb" Apr 17 11:36:12.647825 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:12.647799 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhzzb"] Apr 17 11:36:12.650456 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:36:12.650413 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ec16b0_6a48_4435_994b_37438778f1f0.slice/crio-f390469c71754778f9a0a6f909efd5b40b82f3007371d014c6fbd01a8539ddc3 WatchSource:0}: Error finding container f390469c71754778f9a0a6f909efd5b40b82f3007371d014c6fbd01a8539ddc3: Status 404 returned error can't find the container with id f390469c71754778f9a0a6f909efd5b40b82f3007371d014c6fbd01a8539ddc3 Apr 17 11:36:13.166737 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:13.166697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhzzb" event={"ID":"93ec16b0-6a48-4435-994b-37438778f1f0","Type":"ContainerStarted","Data":"f390469c71754778f9a0a6f909efd5b40b82f3007371d014c6fbd01a8539ddc3"} Apr 17 11:36:17.180607 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:17.180570 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhzzb" event={"ID":"93ec16b0-6a48-4435-994b-37438778f1f0","Type":"ContainerStarted","Data":"a1506a5da7c146ff281837fc92fb874d856ef42e60ef7bf0ba79aa5620ab7755"} Apr 17 11:36:17.196995 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:17.196947 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qhzzb" podStartSLOduration=1.39831709 podStartE2EDuration="5.1969322s" podCreationTimestamp="2026-04-17 11:36:12 +0000 UTC" firstStartedPulling="2026-04-17 11:36:12.652011578 +0000 UTC m=+346.180588805" lastFinishedPulling="2026-04-17 11:36:16.450626689 +0000 UTC m=+349.979203915" observedRunningTime="2026-04-17 11:36:17.195953273 +0000 UTC m=+350.724530516" watchObservedRunningTime="2026-04-17 11:36:17.1969322 +0000 UTC m=+350.725509510" Apr 17 11:36:19.183295 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.183251 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f95cd9564-g9vfc" podUID="024f9aae-e4bf-4250-adc9-eab2d234ab19" containerName="console" containerID="cri-o://341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e" gracePeriod=15 Apr 17 11:36:19.418231 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.418210 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5f95cd9564-g9vfc_024f9aae-e4bf-4250-adc9-eab2d234ab19/console/0.log" Apr 17 11:36:19.418355 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.418269 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:36:19.558813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558715 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-trusted-ca-bundle\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.558813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558782 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-config\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.559034 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558842 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdf9x\" (UniqueName: \"kubernetes.io/projected/024f9aae-e4bf-4250-adc9-eab2d234ab19-kube-api-access-pdf9x\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.559034 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558893 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-oauth-serving-cert\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.559034 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558919 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-serving-cert\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.559034 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558943 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-service-ca\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.559034 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.558983 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-oauth-config\") pod \"024f9aae-e4bf-4250-adc9-eab2d234ab19\" (UID: \"024f9aae-e4bf-4250-adc9-eab2d234ab19\") " Apr 17 11:36:19.559272 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559134 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:36:19.559336 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559316 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-config" (OuterVolumeSpecName: "console-config") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:36:19.559393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559335 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:36:19.559393 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559376 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-service-ca" (OuterVolumeSpecName: "service-ca") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:36:19.559504 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559421 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-trusted-ca-bundle\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:19.559504 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559444 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:19.559504 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.559456 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-oauth-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:19.561310 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.561269 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:36:19.561547 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.561503 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024f9aae-e4bf-4250-adc9-eab2d234ab19-kube-api-access-pdf9x" (OuterVolumeSpecName: "kube-api-access-pdf9x") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "kube-api-access-pdf9x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:36:19.561547 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.561510 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "024f9aae-e4bf-4250-adc9-eab2d234ab19" (UID: "024f9aae-e4bf-4250-adc9-eab2d234ab19"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:36:19.660704 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.660653 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdf9x\" (UniqueName: \"kubernetes.io/projected/024f9aae-e4bf-4250-adc9-eab2d234ab19-kube-api-access-pdf9x\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:19.660704 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.660696 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:19.660704 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.660707 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/024f9aae-e4bf-4250-adc9-eab2d234ab19-service-ca\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:19.660704 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:19.660717 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/024f9aae-e4bf-4250-adc9-eab2d234ab19-console-oauth-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:36:20.191162 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.191127 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f95cd9564-g9vfc_024f9aae-e4bf-4250-adc9-eab2d234ab19/console/0.log" Apr 17 11:36:20.191575 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.191174 2570 generic.go:358] "Generic (PLEG): container finished" podID="024f9aae-e4bf-4250-adc9-eab2d234ab19" containerID="341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e" exitCode=2 Apr 17 11:36:20.191575 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.191214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f95cd9564-g9vfc" event={"ID":"024f9aae-e4bf-4250-adc9-eab2d234ab19","Type":"ContainerDied","Data":"341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e"} Apr 17 11:36:20.191575 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.191236 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f95cd9564-g9vfc" Apr 17 11:36:20.191575 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.191254 2570 scope.go:117] "RemoveContainer" containerID="341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e" Apr 17 11:36:20.191575 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.191239 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f95cd9564-g9vfc" event={"ID":"024f9aae-e4bf-4250-adc9-eab2d234ab19","Type":"ContainerDied","Data":"c5be2f2c9780ce287e3a727109b9a0fe68b946f517f7f2a68f7255e4d5a3531c"} Apr 17 11:36:20.200043 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.200026 2570 scope.go:117] "RemoveContainer" containerID="341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e" Apr 17 11:36:20.200272 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:36:20.200255 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e\": container with ID starting with 341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e not found: ID does not exist" containerID="341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e" Apr 17 11:36:20.200319 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.200281 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e"} err="failed to get container status \"341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e\": rpc error: code = NotFound desc = could not find container \"341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e\": container with ID starting with 341526bb4eb6f89a6ff072bb5486035cd93adddbafc104d10435bef71c95817e not found: ID does not exist" Apr 17 11:36:20.212400 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.212372 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f95cd9564-g9vfc"] Apr 17 11:36:20.216104 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:20.216082 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f95cd9564-g9vfc"] Apr 17 11:36:21.091922 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:36:21.091886 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024f9aae-e4bf-4250-adc9-eab2d234ab19" path="/var/lib/kubelet/pods/024f9aae-e4bf-4250-adc9-eab2d234ab19/volumes" Apr 17 11:37:04.890692 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.890657 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fgt5v"] Apr 17 11:37:04.891107 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.890978 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="024f9aae-e4bf-4250-adc9-eab2d234ab19" containerName="console" Apr 17 11:37:04.891107 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.890990 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="024f9aae-e4bf-4250-adc9-eab2d234ab19" containerName="console" Apr 17 11:37:04.891107 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.891054 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="024f9aae-e4bf-4250-adc9-eab2d234ab19" containerName="console" Apr 17 11:37:04.892878 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.892862 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:04.894924 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.894899 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 11:37:04.895755 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.895730 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-mk4vq\"" Apr 17 11:37:04.895755 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.895751 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 11:37:04.903270 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.903244 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fgt5v"] Apr 17 11:37:04.934115 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.934084 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80c7e422-cc7c-4b94-ae18-c77c0de6e39e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fgt5v\" (UID: \"80c7e422-cc7c-4b94-ae18-c77c0de6e39e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:04.934271 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:04.934139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgkt\" (UniqueName: \"kubernetes.io/projected/80c7e422-cc7c-4b94-ae18-c77c0de6e39e-kube-api-access-5rgkt\") pod \"cert-manager-webhook-597b96b99b-fgt5v\" (UID: \"80c7e422-cc7c-4b94-ae18-c77c0de6e39e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:05.035395 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:05.035360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgkt\" (UniqueName: \"kubernetes.io/projected/80c7e422-cc7c-4b94-ae18-c77c0de6e39e-kube-api-access-5rgkt\") pod \"cert-manager-webhook-597b96b99b-fgt5v\" (UID: \"80c7e422-cc7c-4b94-ae18-c77c0de6e39e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:05.035601 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:05.035472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80c7e422-cc7c-4b94-ae18-c77c0de6e39e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fgt5v\" (UID: \"80c7e422-cc7c-4b94-ae18-c77c0de6e39e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:05.042927 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:05.042895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80c7e422-cc7c-4b94-ae18-c77c0de6e39e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fgt5v\" (UID: \"80c7e422-cc7c-4b94-ae18-c77c0de6e39e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:05.043218 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:05.043194 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rgkt\" (UniqueName: \"kubernetes.io/projected/80c7e422-cc7c-4b94-ae18-c77c0de6e39e-kube-api-access-5rgkt\") pod \"cert-manager-webhook-597b96b99b-fgt5v\" (UID: \"80c7e422-cc7c-4b94-ae18-c77c0de6e39e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 
11:37:05.212167 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:05.212081 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:05.336104 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:05.336080 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fgt5v"] Apr 17 11:37:05.338331 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:37:05.338305 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80c7e422_cc7c_4b94_ae18_c77c0de6e39e.slice/crio-48be6036fa5d5ef4419f6bec3168f761ec6d4704306b16c424026ae9eff377fd WatchSource:0}: Error finding container 48be6036fa5d5ef4419f6bec3168f761ec6d4704306b16c424026ae9eff377fd: Status 404 returned error can't find the container with id 48be6036fa5d5ef4419f6bec3168f761ec6d4704306b16c424026ae9eff377fd Apr 17 11:37:06.328612 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:06.328566 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" event={"ID":"80c7e422-cc7c-4b94-ae18-c77c0de6e39e","Type":"ContainerStarted","Data":"48be6036fa5d5ef4419f6bec3168f761ec6d4704306b16c424026ae9eff377fd"} Apr 17 11:37:09.344602 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:09.344569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" event={"ID":"80c7e422-cc7c-4b94-ae18-c77c0de6e39e","Type":"ContainerStarted","Data":"624e6a3edbb48bfed5ce0fca675557b945a3fb81ba4ac43c00e5ad2e3d27b1ad"} Apr 17 11:37:09.345058 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:09.344685 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:09.363826 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:09.363779 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" podStartSLOduration=1.967790125 podStartE2EDuration="5.363767254s" podCreationTimestamp="2026-04-17 11:37:04 +0000 UTC" firstStartedPulling="2026-04-17 11:37:05.340178122 +0000 UTC m=+398.868755354" lastFinishedPulling="2026-04-17 11:37:08.736155251 +0000 UTC m=+402.264732483" observedRunningTime="2026-04-17 11:37:09.362692663 +0000 UTC m=+402.891269912" watchObservedRunningTime="2026-04-17 11:37:09.363767254 +0000 UTC m=+402.892344503" Apr 17 11:37:15.350980 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:15.350949 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-fgt5v" Apr 17 11:37:51.317727 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.317650 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799"] Apr 17 11:37:51.320938 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.320916 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.324271 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.324251 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 11:37:51.324271 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.324265 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-4t465\"" Apr 17 11:37:51.324465 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.324296 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 17 11:37:51.324465 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.324265 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 17 11:37:51.324465 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.324264 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 17 11:37:51.324465 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.324266 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:37:51.335822 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.335795 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799"] Apr 17 11:37:51.420971 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.420927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djf79\" (UniqueName: \"kubernetes.io/projected/598d52f0-87d0-41a0-a0c6-e186bec19773-kube-api-access-djf79\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.421159 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.421013 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/598d52f0-87d0-41a0-a0c6-e186bec19773-metrics-certs\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.421159 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.421044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/598d52f0-87d0-41a0-a0c6-e186bec19773-manager-config\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.421159 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.421067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/598d52f0-87d0-41a0-a0c6-e186bec19773-cert\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.522153 ip-10-0-130-210 
kubenswrapper[2570]: I0417 11:37:51.522097 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/598d52f0-87d0-41a0-a0c6-e186bec19773-metrics-certs\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.522153 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.522161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/598d52f0-87d0-41a0-a0c6-e186bec19773-manager-config\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.522383 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.522182 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/598d52f0-87d0-41a0-a0c6-e186bec19773-cert\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.522383 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.522225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djf79\" (UniqueName: \"kubernetes.io/projected/598d52f0-87d0-41a0-a0c6-e186bec19773-kube-api-access-djf79\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.522877 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.522846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/598d52f0-87d0-41a0-a0c6-e186bec19773-manager-config\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.524669 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.524641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/598d52f0-87d0-41a0-a0c6-e186bec19773-cert\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.524771 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.524716 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/598d52f0-87d0-41a0-a0c6-e186bec19773-metrics-certs\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.529236 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.529210 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djf79\" (UniqueName: \"kubernetes.io/projected/598d52f0-87d0-41a0-a0c6-e186bec19773-kube-api-access-djf79\") pod \"jobset-controller-manager-75b9769c54-tl799\" (UID: \"598d52f0-87d0-41a0-a0c6-e186bec19773\") " pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.632315 
ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.632231 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:51.750890 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:51.750861 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799"] Apr 17 11:37:51.753170 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:37:51.753142 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598d52f0_87d0_41a0_a0c6_e186bec19773.slice/crio-131225b5e8b43b7a6cbd05db46904a2840a5f49e2675ff3611f6f6d42bf71aae WatchSource:0}: Error finding container 131225b5e8b43b7a6cbd05db46904a2840a5f49e2675ff3611f6f6d42bf71aae: Status 404 returned error can't find the container with id 131225b5e8b43b7a6cbd05db46904a2840a5f49e2675ff3611f6f6d42bf71aae Apr 17 11:37:52.475547 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:52.475494 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" event={"ID":"598d52f0-87d0-41a0-a0c6-e186bec19773","Type":"ContainerStarted","Data":"131225b5e8b43b7a6cbd05db46904a2840a5f49e2675ff3611f6f6d42bf71aae"} Apr 17 11:37:54.483330 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:54.483295 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" event={"ID":"598d52f0-87d0-41a0-a0c6-e186bec19773","Type":"ContainerStarted","Data":"b933ac3cd7e3c76b561922856db03f752f35ff43109f75759b3510db4eec70ca"} Apr 17 11:37:54.483730 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:54.483364 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:37:54.500452 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:37:54.500382 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" podStartSLOduration=1.340794951 podStartE2EDuration="3.500366626s" podCreationTimestamp="2026-04-17 11:37:51 +0000 UTC" firstStartedPulling="2026-04-17 11:37:51.755130949 +0000 UTC m=+445.283708177" lastFinishedPulling="2026-04-17 11:37:53.914702626 +0000 UTC m=+447.443279852" observedRunningTime="2026-04-17 11:37:54.499407957 +0000 UTC m=+448.027985207" watchObservedRunningTime="2026-04-17 11:37:54.500366626 +0000 UTC m=+448.028943874" Apr 17 11:38:05.495149 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:38:05.495114 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-75b9769c54-tl799" Apr 17 11:40:11.533907 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.533865 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-87d7cfd76-k5sx4"] Apr 17 11:40:11.537303 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.537275 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.544971 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.544944 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87d7cfd76-k5sx4"] Apr 17 11:40:11.703666 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-oauth-serving-cert\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.703666 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-serving-cert\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.703892 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-service-ca\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.703892 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703721 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-trusted-ca-bundle\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.703892 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703746 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-oauth-config\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.703892 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt97r\" (UniqueName: \"kubernetes.io/projected/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-kube-api-access-tt97r\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.703892 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.703830 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-config\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.804928 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.804830 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-trusted-ca-bundle\") 
pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.804928 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.804876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-oauth-config\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.804928 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.804924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tt97r\" (UniqueName: \"kubernetes.io/projected/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-kube-api-access-tt97r\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805211 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.804961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-config\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805211 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-oauth-serving-cert\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805211 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-serving-cert\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805211 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-service-ca\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805832 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-trusted-ca-bundle\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805832 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805814 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-oauth-serving-cert\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805999 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-service-ca\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.805999 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.805848 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-config\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.807682 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.807650 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-oauth-config\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.807814 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.807793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-console-serving-cert\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.813183 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.813162 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt97r\" (UniqueName: \"kubernetes.io/projected/ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0-kube-api-access-tt97r\") pod \"console-87d7cfd76-k5sx4\" (UID: \"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0\") " pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.847377 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.847347 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:11.967193 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:11.967126 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87d7cfd76-k5sx4"] Apr 17 11:40:11.969968 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:40:11.969938 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2ec323_b2b0_47b9_9107_3a5d8d8f30d0.slice/crio-80ce0f2860efa97e87193c337d9d3d47344095e63deb16aae8897ff965ea8a45 WatchSource:0}: Error finding container 80ce0f2860efa97e87193c337d9d3d47344095e63deb16aae8897ff965ea8a45: Status 404 returned error can't find the container with id 80ce0f2860efa97e87193c337d9d3d47344095e63deb16aae8897ff965ea8a45 Apr 17 11:40:12.905619 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:12.905580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87d7cfd76-k5sx4" event={"ID":"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0","Type":"ContainerStarted","Data":"e252bb49a52908439b763bf98fbf1f4dac835820f39a1dc2184bcc80643de2d8"} Apr 17 11:40:12.905619 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:12.905619 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87d7cfd76-k5sx4" event={"ID":"ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0","Type":"ContainerStarted","Data":"80ce0f2860efa97e87193c337d9d3d47344095e63deb16aae8897ff965ea8a45"} Apr 17 11:40:12.921972 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:12.921924 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-87d7cfd76-k5sx4" podStartSLOduration=1.92190977 podStartE2EDuration="1.92190977s" podCreationTimestamp="2026-04-17 11:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:40:12.920599474 +0000 UTC m=+586.449176723" watchObservedRunningTime="2026-04-17 11:40:12.92190977 +0000 UTC m=+586.450487019" Apr 17 11:40:21.847543 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:21.847481 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:21.847950 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:21.847557 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:21.852383 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:21.852360 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:21.936144 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:21.936114 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-87d7cfd76-k5sx4" Apr 17 11:40:21.983455 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:21.983420 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f97688f88-6hblw"] Apr 17 11:40:47.003177 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.003115 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f97688f88-6hblw" podUID="4ecd67b0-d955-42bb-a311-f15d06d5915b" containerName="console" containerID="cri-o://a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46" gracePeriod=15 Apr 17 11:40:47.252304 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.252278 2570 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f97688f88-6hblw_4ecd67b0-d955-42bb-a311-f15d06d5915b/console/0.log" Apr 17 11:40:47.252442 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.252339 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:40:47.326289 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326254 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-config\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.326458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326295 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-service-ca\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.326458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326350 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-serving-cert\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.326458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326384 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-oauth-config\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.326458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326413 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4mhn\" (UniqueName: \"kubernetes.io/projected/4ecd67b0-d955-42bb-a311-f15d06d5915b-kube-api-access-c4mhn\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.326699 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326460 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-oauth-serving-cert\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.326699 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326486 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-trusted-ca-bundle\") pod \"4ecd67b0-d955-42bb-a311-f15d06d5915b\" (UID: \"4ecd67b0-d955-42bb-a311-f15d06d5915b\") " Apr 17 11:40:47.327053 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326844 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-config" (OuterVolumeSpecName: "console-config") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:40:47.327053 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326905 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-service-ca" (OuterVolumeSpecName: "service-ca") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:40:47.327053 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.326940 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:40:47.327053 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.327024 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:40:47.328905 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.328875 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:40:47.329040 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.328926 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ecd67b0-d955-42bb-a311-f15d06d5915b-kube-api-access-c4mhn" (OuterVolumeSpecName: "kube-api-access-c4mhn") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "kube-api-access-c4mhn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:40:47.329040 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.328952 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4ecd67b0-d955-42bb-a311-f15d06d5915b" (UID: "4ecd67b0-d955-42bb-a311-f15d06d5915b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:40:47.427737 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427701 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:47.427737 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427730 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-service-ca\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:47.427737 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427739 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:47.427961 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427751 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ecd67b0-d955-42bb-a311-f15d06d5915b-console-oauth-config\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:47.427961 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427760 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c4mhn\" (UniqueName: \"kubernetes.io/projected/4ecd67b0-d955-42bb-a311-f15d06d5915b-kube-api-access-c4mhn\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:47.427961 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427768 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-oauth-serving-cert\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:47.427961 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:47.427777 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecd67b0-d955-42bb-a311-f15d06d5915b-trusted-ca-bundle\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:40:48.010378 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.010351 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f97688f88-6hblw_4ecd67b0-d955-42bb-a311-f15d06d5915b/console/0.log" Apr 17 11:40:48.010813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.010392 2570 generic.go:358] "Generic (PLEG): container finished" podID="4ecd67b0-d955-42bb-a311-f15d06d5915b" containerID="a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46" exitCode=2 Apr 17 11:40:48.010813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.010462 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f97688f88-6hblw" Apr 17 11:40:48.010813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.010479 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f97688f88-6hblw" event={"ID":"4ecd67b0-d955-42bb-a311-f15d06d5915b","Type":"ContainerDied","Data":"a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46"} Apr 17 11:40:48.010813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.010537 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f97688f88-6hblw" event={"ID":"4ecd67b0-d955-42bb-a311-f15d06d5915b","Type":"ContainerDied","Data":"d59f187c65c6c82d5613d5b1f3377aec898ab1caa3c0b4d064eba7b38bfacd2c"} Apr 17 11:40:48.010813 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.010554 2570 scope.go:117] "RemoveContainer" containerID="a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46" Apr 17 11:40:48.019726 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.019700 2570 scope.go:117] "RemoveContainer" containerID="a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46" Apr 17 11:40:48.020094 ip-10-0-130-210 kubenswrapper[2570]: E0417 11:40:48.020069 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46\": container with ID starting with a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46 not found: ID does not exist" containerID="a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46" Apr 17 11:40:48.020170 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.020107 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46"} err="failed to get container status \"a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46\": rpc error: code = NotFound desc = could not find container \"a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46\": container with ID starting with a1ecf0ee324b05e890555c04cd344d6678e69b399c5cc9cb02258409f0244a46 not found: ID does not exist" Apr 17 11:40:48.038628 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.038588 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f97688f88-6hblw"] Apr 17 11:40:48.043270 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:48.043238 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f97688f88-6hblw"] Apr 17 11:40:49.091851 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:40:49.091804 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ecd67b0-d955-42bb-a311-f15d06d5915b" path="/var/lib/kubelet/pods/4ecd67b0-d955-42bb-a311-f15d06d5915b/volumes" Apr 17 11:47:35.823563 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.823504 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"] Apr 17 11:47:35.826017 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.823966 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ecd67b0-d955-42bb-a311-f15d06d5915b" containerName="console" Apr 17 11:47:35.826017 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.823980 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecd67b0-d955-42bb-a311-f15d06d5915b" containerName="console" Apr 17 11:47:35.826017 ip-10-0-130-210 
Apr 17 11:47:35.827037 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.827018 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:47:35.829303 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.829281 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"openshift-service-ca.crt\""
Apr 17 11:47:35.829303 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.829300 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"kube-root-ca.crt\""
Apr 17 11:47:35.829911 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.829896 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-hfmts\"/\"default-dockercfg-h6m6b\""
Apr 17 11:47:35.835699 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.835672 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"]
Apr 17 11:47:35.879135 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.879092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzlc\" (UniqueName: \"kubernetes.io/projected/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48-kube-api-access-4hzlc\") pod \"progression-min-interval-node-0-0-zbmbf\" (UID: \"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48\") " pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:47:35.979888 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.979844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzlc\" (UniqueName: \"kubernetes.io/projected/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48-kube-api-access-4hzlc\") pod \"progression-min-interval-node-0-0-zbmbf\" (UID: \"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48\") " pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:47:35.988353 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:35.988329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hzlc\" (UniqueName: \"kubernetes.io/projected/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48-kube-api-access-4hzlc\") pod \"progression-min-interval-node-0-0-zbmbf\" (UID: \"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48\") " pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:47:36.138200 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:36.138123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:47:36.261374 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:36.261342 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"]
Apr 17 11:47:36.264147 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:47:36.264118 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48db52a5_2a5d_4fb2_94f3_a3a5ad176e48.slice/crio-9fd69d1736a0aae1b940d1e20cfc68b9a71688230301b734a1f9d914921dad01 WatchSource:0}: Error finding container 9fd69d1736a0aae1b940d1e20cfc68b9a71688230301b734a1f9d914921dad01: Status 404 returned error can't find the container with id 9fd69d1736a0aae1b940d1e20cfc68b9a71688230301b734a1f9d914921dad01
Apr 17 11:47:36.266759 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:36.266741 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:47:37.256717 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:47:37.256661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" event={"ID":"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48","Type":"ContainerStarted","Data":"9fd69d1736a0aae1b940d1e20cfc68b9a71688230301b734a1f9d914921dad01"}
Apr 17 11:49:19.595488 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:19.595446 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" event={"ID":"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48","Type":"ContainerStarted","Data":"ca474316f204e5febe1e98cca9bc7a570d78d790514a315d469fa0bc21bcf91a"}
Apr 17 11:49:19.595991 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:19.595506 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:49:19.622912 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:19.622859 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" podStartSLOduration=2.299480324 podStartE2EDuration="1m44.62284182s" podCreationTimestamp="2026-04-17 11:47:35 +0000 UTC" firstStartedPulling="2026-04-17 11:47:36.266867186 +0000 UTC m=+1029.795444413" lastFinishedPulling="2026-04-17 11:49:18.590228679 +0000 UTC m=+1132.118805909" observedRunningTime="2026-04-17 11:49:19.620813641 +0000 UTC m=+1133.149390890" watchObservedRunningTime="2026-04-17 11:49:19.62284182 +0000 UTC m=+1133.151419069"
Apr 17 11:49:20.597936 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:20.597904 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"
Apr 17 11:49:41.880057 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:41.880008 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": read tcp 10.134.0.2:56938->10.134.0.29:28080: read: connection reset by peer"
Apr 17 11:49:42.596199 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:42.596110 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": dial tcp 10.134.0.29:28080: connect: connection refused"
podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": dial tcp 10.134.0.29:28080: connect: connection refused" Apr 17 11:49:42.596363 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:42.596218 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" Apr 17 11:49:42.596692 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:42.596664 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": dial tcp 10.134.0.29:28080: connect: connection refused" Apr 17 11:49:42.671387 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:42.671353 2570 generic.go:358] "Generic (PLEG): container finished" podID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerID="ca474316f204e5febe1e98cca9bc7a570d78d790514a315d469fa0bc21bcf91a" exitCode=0 Apr 17 11:49:42.671577 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:42.671403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" event={"ID":"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48","Type":"ContainerDied","Data":"ca474316f204e5febe1e98cca9bc7a570d78d790514a315d469fa0bc21bcf91a"} Apr 17 11:49:43.797863 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:43.797836 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" Apr 17 11:49:43.934742 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:43.934660 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hzlc\" (UniqueName: \"kubernetes.io/projected/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48-kube-api-access-4hzlc\") pod \"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48\" (UID: \"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48\") " Apr 17 11:49:43.936841 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:43.936819 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48-kube-api-access-4hzlc" (OuterVolumeSpecName: "kube-api-access-4hzlc") pod "48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" (UID: "48db52a5-2a5d-4fb2-94f3-a3a5ad176e48"). InnerVolumeSpecName "kube-api-access-4hzlc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:49:44.035184 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:44.035154 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hzlc\" (UniqueName: \"kubernetes.io/projected/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48-kube-api-access-4hzlc\") on node \"ip-10-0-130-210.ec2.internal\" DevicePath \"\"" Apr 17 11:49:44.679371 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:44.679332 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" event={"ID":"48db52a5-2a5d-4fb2-94f3-a3a5ad176e48","Type":"ContainerDied","Data":"9fd69d1736a0aae1b940d1e20cfc68b9a71688230301b734a1f9d914921dad01"} Apr 17 11:49:44.679371 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:44.679376 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd69d1736a0aae1b940d1e20cfc68b9a71688230301b734a1f9d914921dad01" Apr 17 11:49:44.679604 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:49:44.679347 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf" Apr 17 11:52:53.065092 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:52:53.065007 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"] Apr 17 11:52:53.067954 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:52:53.067932 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-hfmts/progression-min-interval-node-0-0-zbmbf"] Apr 17 11:52:53.092207 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:52:53.092172 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" path="/var/lib/kubelet/pods/48db52a5-2a5d-4fb2-94f3-a3a5ad176e48/volumes" Apr 17 11:53:44.082867 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:44.082839 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qhzzb_93ec16b0-6a48-4435-994b-37438778f1f0/global-pull-secret-syncer/0.log" Apr 17 11:53:44.157531 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:44.157492 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hlx77_33c1ecda-fae8-404c-a67f-f189a105cd44/konnectivity-agent/0.log" Apr 17 11:53:44.202237 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:44.202207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-210.ec2.internal_45cba5abf7cd7b00c2f9306a991cc4cf/haproxy/0.log" Apr 17 11:53:47.088168 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.088131 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/alertmanager/0.log" Apr 17 11:53:47.116138 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.116109 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/config-reloader/0.log" Apr 17 11:53:47.143534 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.143495 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/kube-rbac-proxy-web/0.log" Apr 17 11:53:47.174827 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.174802 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/kube-rbac-proxy/0.log" Apr 17 11:53:47.201254 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.201228 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/kube-rbac-proxy-metric/0.log" Apr 17 11:53:47.222335 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.222310 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/prom-label-proxy/0.log" Apr 17 11:53:47.248056 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.248029 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ededfcf7-3473-4054-94eb-c59a2a48cdf0/init-config-reloader/0.log" Apr 17 11:53:47.283280 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.283253 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-jphln_6a5d3f15-276a-483f-8bff-f93a79e3882e/cluster-monitoring-operator/0.log" Apr 17 11:53:47.582013 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.581988 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rjsqn_34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8/node-exporter/0.log" Apr 17 11:53:47.604380 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.604355 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rjsqn_34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8/kube-rbac-proxy/0.log" Apr 17 11:53:47.624651 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.624631 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rjsqn_34e2ff2c-c93b-4f81-b6bd-478fd6c17ae8/init-textfile/0.log" Apr 17 11:53:47.941904 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.941824 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f4b8f8d4c-pqk8l_1c40b491-1f4d-4dc0-8428-53bc257ebd9d/telemeter-client/0.log" Apr 17 11:53:47.963661 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.963619 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f4b8f8d4c-pqk8l_1c40b491-1f4d-4dc0-8428-53bc257ebd9d/reload/0.log" Apr 17 11:53:47.983919 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:47.983900 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f4b8f8d4c-pqk8l_1c40b491-1f4d-4dc0-8428-53bc257ebd9d/kube-rbac-proxy/0.log" Apr 17 11:53:48.013022 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:48.012994 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59ccbf6d54-57ct9_6a19e39a-6f59-44dd-ada4-c336a0970663/thanos-query/0.log" Apr 17 11:53:48.031982 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:48.031957 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59ccbf6d54-57ct9_6a19e39a-6f59-44dd-ada4-c336a0970663/kube-rbac-proxy-web/0.log" Apr 17 11:53:48.061509 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:48.061484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59ccbf6d54-57ct9_6a19e39a-6f59-44dd-ada4-c336a0970663/kube-rbac-proxy/0.log" Apr 17 11:53:48.080210 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:48.080186 2570 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59ccbf6d54-57ct9_6a19e39a-6f59-44dd-ada4-c336a0970663/prom-label-proxy/0.log" Apr 17 11:53:48.103664 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:48.103645 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59ccbf6d54-57ct9_6a19e39a-6f59-44dd-ada4-c336a0970663/kube-rbac-proxy-rules/0.log" Apr 17 11:53:48.125348 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:48.125325 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59ccbf6d54-57ct9_6a19e39a-6f59-44dd-ada4-c336a0970663/kube-rbac-proxy-metrics/0.log" Apr 17 11:53:50.053725 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.053650 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87d7cfd76-k5sx4_ad2ec323-b2b0-47b9-9107-3a5d8d8f30d0/console/0.log" Apr 17 11:53:50.085396 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.085367 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-k5dth_41962a97-4ad6-4005-b2ed-b2a0a463e7e0/download-server/0.log" Apr 17 11:53:50.813492 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.813456 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"] Apr 17 11:53:50.813842 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.813828 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" Apr 17 11:53:50.813887 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.813845 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" Apr 17 11:53:50.813920 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.813900 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="48db52a5-2a5d-4fb2-94f3-a3a5ad176e48" containerName="node" Apr 17 11:53:50.816763 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.816740 2570 util.go:30] "No sandbox for pod can be found. 
Apr 17 11:53:50.818764 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.818743 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jjn79\"/\"kube-root-ca.crt\""
Apr 17 11:53:50.818853 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.818788 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jjn79\"/\"openshift-service-ca.crt\""
Apr 17 11:53:50.819371 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.819354 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jjn79\"/\"default-dockercfg-9rt4j\""
Apr 17 11:53:50.823085 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.823053 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"]
Apr 17 11:53:50.909408 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.909368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-proc\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:50.909408 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.909412 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-sys\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:50.909651 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.909445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-lib-modules\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:50.909651 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.909468 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqqf7\" (UniqueName: \"kubernetes.io/projected/cb127228-28e1-471e-865b-f27534fef160-kube-api-access-mqqf7\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:50.909651 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:50.909563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-podres\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.010810 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010780 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-proc\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.010896 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010818 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-sys\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.010896 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-lib-modules\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.010967 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-sys\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.010967 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-proc\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.010967 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqqf7\" (UniqueName: \"kubernetes.io/projected/cb127228-28e1-471e-865b-f27534fef160-kube-api-access-mqqf7\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.011068 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.010981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-podres\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.011105 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.011086 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-lib-modules\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.011139 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.011092 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb127228-28e1-471e-865b-f27534fef160-podres\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.017598 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.017576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqf7\" (UniqueName: \"kubernetes.io/projected/cb127228-28e1-471e-865b-f27534fef160-kube-api-access-mqqf7\") pod \"perf-node-gather-daemonset-vfxtr\" (UID: \"cb127228-28e1-471e-865b-f27534fef160\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.065723 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.065642 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cxbf8_66f573d5-80a9-4ecf-ad6b-6cf684898a74/dns/0.log"
Apr 17 11:53:51.083339 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.083295 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cxbf8_66f573d5-80a9-4ecf-ad6b-6cf684898a74/kube-rbac-proxy/0.log"
Apr 17 11:53:51.128399 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.128368 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.140975 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.140932 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7vxqv_57cb4093-bdc6-4637-9afc-7364349a96d4/dns-node-resolver/0.log"
Apr 17 11:53:51.249939 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.249903 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"]
Apr 17 11:53:51.253180 ip-10-0-130-210 kubenswrapper[2570]: W0417 11:53:51.253141 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb127228_28e1_471e_865b_f27534fef160.slice/crio-b64f95a8304a98b88cddcbf472d1f6c09ea7f2572bf3b0658c8d622cfa5dfc99 WatchSource:0}: Error finding container b64f95a8304a98b88cddcbf472d1f6c09ea7f2572bf3b0658c8d622cfa5dfc99: Status 404 returned error can't find the container with id b64f95a8304a98b88cddcbf472d1f6c09ea7f2572bf3b0658c8d622cfa5dfc99
Apr 17 11:53:51.254705 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.254689 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:53:51.414730 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.414699 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr" event={"ID":"cb127228-28e1-471e-865b-f27534fef160","Type":"ContainerStarted","Data":"fb11a519ad9cff9840e85a7d3b666c90cc2998ea42a869f87a9d5abe635a352c"}
Apr 17 11:53:51.414730 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.414731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr" event={"ID":"cb127228-28e1-471e-865b-f27534fef160","Type":"ContainerStarted","Data":"b64f95a8304a98b88cddcbf472d1f6c09ea7f2572bf3b0658c8d622cfa5dfc99"}
Apr 17 11:53:51.414928 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.414756 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:51.428733 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.428683 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr" podStartSLOduration=1.428668329 podStartE2EDuration="1.428668329s" podCreationTimestamp="2026-04-17 11:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:53:51.427407595 +0000 UTC m=+1404.955984870" watchObservedRunningTime="2026-04-17 11:53:51.428668329 +0000 UTC m=+1404.957245577"
Apr 17 11:53:51.550477 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.550446 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-756549b58b-rdkhs_58df3935-ab94-49e4-9e5e-a716b5374775/registry/0.log"
Apr 17 11:53:51.585221 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:51.585147 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ts68p_3a3f50d1-8ddd-40ce-9dee-cc684dedf9d4/node-ca/0.log"
Apr 17 11:53:52.567655 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:52.567627 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pdqss_e3bee7d3-b329-44aa-922f-a04ce5b599e7/serve-healthcheck-canary/0.log"
Apr 17 11:53:53.014868 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:53.014795 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs4lh_226b68ce-7c71-4b87-a853-b012721e7c68/kube-rbac-proxy/0.log"
Apr 17 11:53:53.031233 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:53.031211 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs4lh_226b68ce-7c71-4b87-a853-b012721e7c68/exporter/0.log"
Apr 17 11:53:53.047826 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:53.047807 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs4lh_226b68ce-7c71-4b87-a853-b012721e7c68/extractor/0.log"
Apr 17 11:53:54.632185 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:54.632140 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-75b9769c54-tl799_598d52f0-87d0-41a0-a0c6-e186bec19773/manager/0.log"
Apr 17 11:53:57.428059 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:57.428033 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-vfxtr"
Apr 17 11:53:57.766949 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:57.766864 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wv6fb_74b00ba1-c80d-4505-a365-8d7a3c6c7f9a/kube-storage-version-migrator-operator/1.log"
Apr 17 11:53:57.767843 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:57.767818 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wv6fb_74b00ba1-c80d-4505-a365-8d7a3c6c7f9a/kube-storage-version-migrator-operator/0.log"
Apr 17 11:53:58.758499 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.758469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/kube-multus-additional-cni-plugins/0.log"
Apr 17 11:53:58.776242 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.776215 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/egress-router-binary-copy/0.log"
Apr 17 11:53:58.794634 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.794609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/cni-plugins/0.log"
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/cni-plugins/0.log" Apr 17 11:53:58.812107 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.812077 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/bond-cni-plugin/0.log" Apr 17 11:53:58.831188 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.831161 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/routeoverride-cni/0.log" Apr 17 11:53:58.848811 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.848784 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/whereabouts-cni-bincopy/0.log" Apr 17 11:53:58.867649 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.867624 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q2phv_652897df-2286-4fbc-9cf6-a7ce5de5d8a3/whereabouts-cni/0.log" Apr 17 11:53:58.945542 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:58.945488 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mpm2g_fa896c93-030e-48d4-afe6-575b621eca31/kube-multus/0.log" Apr 17 11:53:59.049666 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.049583 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z52nx_343340da-6202-4b41-8b3d-4e0c0f72ecb6/network-metrics-daemon/0.log" Apr 17 11:53:59.067642 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.067613 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z52nx_343340da-6202-4b41-8b3d-4e0c0f72ecb6/kube-rbac-proxy/0.log" Apr 17 11:53:59.880208 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.880157 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/ovn-controller/0.log" Apr 17 11:53:59.904506 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.904470 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/ovn-acl-logging/0.log" Apr 17 11:53:59.923062 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.923032 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/kube-rbac-proxy-node/0.log" Apr 17 11:53:59.942877 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.942846 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:53:59.958458 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.958415 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/northd/0.log" Apr 17 11:53:59.988588 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:53:59.988550 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/nbdb/0.log" Apr 17 11:54:00.009044 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:00.009015 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/sbdb/0.log" Apr 17 11:54:00.098352 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:00.098321 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9qrz2_78932d63-d2fd-4c01-8666-7f65f21faaac/ovnkube-controller/0.log" Apr 17 11:54:01.564629 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:01.564606 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-78fsm_431fab13-4e22-4667-b273-df590c4b98bd/check-endpoints/0.log" Apr 17 11:54:01.586890 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:01.586859 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5r2k5_d764f3ad-e076-4b99-8a6f-716b6d83c925/network-check-target-container/0.log" Apr 17 11:54:02.453131 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:02.453103 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xhfhs_bec258cb-b4a0-425e-b582-b392c2becdfe/iptables-alerter/0.log" Apr 17 11:54:03.084433 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:03.084402 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gltrt_5a52dd25-28a3-4e7e-95e7-856ba22c0c17/tuned/0.log" Apr 17 11:54:05.462913 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:05.462883 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-n7ngq_acf5f208-16b3-41eb-b3eb-0b10391e7e74/service-ca-operator/1.log" Apr 17 11:54:05.463339 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:05.463265 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-n7ngq_acf5f208-16b3-41eb-b3eb-0b10391e7e74/service-ca-operator/0.log" Apr 17 11:54:05.725990 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:05.725905 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-cvp9f_3479fb90-7107-42aa-ae3e-b6291ca30d3f/service-ca-controller/0.log" Apr 17 11:54:06.038486 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:06.038416 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-drxgw_bc493885-90a9-4fcc-9331-806c0d60be7d/csi-driver/0.log" Apr 17 11:54:06.056216 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:06.056188 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-drxgw_bc493885-90a9-4fcc-9331-806c0d60be7d/csi-node-driver-registrar/0.log" Apr 17 11:54:06.073711 ip-10-0-130-210 kubenswrapper[2570]: I0417 11:54:06.073674 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-drxgw_bc493885-90a9-4fcc-9331-806c0d60be7d/csi-liveness-probe/0.log"