Apr 22 18:20:49.557819 ip-10-0-143-88 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:20:50.070663 ip-10-0-143-88 kubenswrapper[2551]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:50.070663 ip-10-0-143-88 kubenswrapper[2551]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:20:50.070663 ip-10-0-143-88 kubenswrapper[2551]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:50.070663 ip-10-0-143-88 kubenswrapper[2551]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:20:50.070663 ip-10-0-143-88 kubenswrapper[2551]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:50.071702 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.071615 2551 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:20:50.074731 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074712 2551 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:50.074731 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074727 2551 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:50.074731 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074741 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:50.074731 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074745 2551 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074749 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074755 2551 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074758 2551 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074762 2551 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074766 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074769 2551 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074773 2551 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074777 2551 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074780 2551 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074785 2551 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074789 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074792 2551 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074796 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074800 2551 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074804 2551 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074809 2551 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074813 2551 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074817 2551 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074821 2551 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:50.074974 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074825 2551 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074829 2551 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074833 2551 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074837 2551 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074841 2551 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074845 2551 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074848 2551 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074852 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074856 2551 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074859 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074864 2551 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074868 2551 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074873 2551 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074877 2551 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074883 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074888 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074892 2551 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074896 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074900 2551 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074904 2551 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:50.075862 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074908 2551 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074912 2551 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074916 2551 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074920 2551 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074925 2551 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074929 2551 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074933 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074937 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074941 2551 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074945 2551 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074952 2551 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074958 2551 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074963 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074968 2551 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074974 2551 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074978 2551 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074983 2551 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074987 2551 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074991 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:50.076723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074995 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.074999 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075003 2551 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075007 2551 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075012 2551 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075017 2551 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075021 2551 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075026 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075030 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075034 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075039 2551 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075043 2551 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075047 2551 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075051 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075057 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075061 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075066 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075072 2551 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075078 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:50.077593 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075084 2551 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075088 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075092 2551 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075096 2551 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075100 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075774 2551 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075784 2551 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075790 2551 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075794 2551 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075798 2551 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075803 2551 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075808 2551 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075812 2551 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075816 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075821 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075826 2551 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075830 2551 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075835 2551 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075839 2551 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075843 2551 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:50.078097 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075849 2551 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075853 2551 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075857 2551 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075861 2551 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075865 2551 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075869 2551 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075873 2551 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075877 2551 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075880 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075884 2551 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075888 2551 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075892 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075896 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075900 2551 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075905 2551 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075909 2551 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075913 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075917 2551 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075921 2551 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:50.078676 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075925 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075929 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075933 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075937 2551 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075941 2551 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075945 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075949 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075953 2551 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075957 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075961 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075965 2551 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075970 2551 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075974 2551 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075980 2551 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075985 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075989 2551 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075992 2551 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.075996 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076000 2551 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076004 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:50.079278 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076008 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076012 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076016 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076020 2551 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076025 2551 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076029 2551 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076033 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076037 2551 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076041 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076045 2551 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076049 2551 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076055 2551 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076059 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076065 2551 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076071 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076076 2551 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076080 2551 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076084 2551 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076089 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:50.079912 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076093 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076098 2551 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076102 2551 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076106 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076114 2551 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076121 2551 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076126 2551 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076130 2551 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076135 2551 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076139 2551 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076144 2551 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076148 2551 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.076152 2551 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076253 2551 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076279 2551 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076290 2551 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076297 2551 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076304 2551 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076308 2551 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076315 2551 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076322 2551 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:20:50.080399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076327 2551 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076332 2551 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076337 2551 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076343 2551 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076348 2551 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076353 2551 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076357 2551 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076362 2551 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076367 2551 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076371 2551 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076381 2551 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076388 2551 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076392 2551 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076398 2551 flags.go:64] FLAG: --config-dir=""
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076402 2551 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076408 2551 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076414 2551 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076419 2551 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076425 2551 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076430 2551 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076435 2551 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076439 2551 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076444 2551 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076448 2551 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076453 2551 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:20:50.081033 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076460 2551 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076465 2551 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076469 2551 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076474 2551 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076478 2551 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076483 2551 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076490 2551 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076495 2551 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076499 2551 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076504 2551 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076509 2551 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076515 2551 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076519 2551 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076524 2551 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076529 2551 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076534 2551 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076539 2551 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076545 2551 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076569 2551 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076574 2551 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076578 2551 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076583 2551 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076589 2551 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076594 2551 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076599 2551 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:20:50.081661 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076605 2551 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076610 2551 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076615 2551 flags.go:64] FLAG: --help="false"
Apr 22 18:20:50.082290
ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076620 2551 flags.go:64] FLAG: --hostname-override="ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076625 2551 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076630 2551 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076634 2551 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076640 2551 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076646 2551 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076651 2551 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076655 2551 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076659 2551 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076664 2551 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076669 2551 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076674 2551 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076679 2551 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076683 2551 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076688 2551 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076693 2551 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076697 2551 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076702 2551 flags.go:64] FLAG: --lock-file="" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076706 2551 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076711 2551 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076717 2551 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:20:50.082290 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076726 2551 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076731 2551 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076736 2551 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076740 2551 flags.go:64] FLAG: --logging-format="text" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076745 2551 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076750 2551 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076755 2551 flags.go:64] FLAG: --manifest-url="" Apr 22 18:20:50.082893 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:20:50.076759 2551 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076766 2551 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076772 2551 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076778 2551 flags.go:64] FLAG: --max-pods="110" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076782 2551 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076787 2551 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076791 2551 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076796 2551 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076801 2551 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076806 2551 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076811 2551 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076822 2551 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076827 2551 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076832 2551 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076837 2551 
flags.go:64] FLAG: --pod-cidr="" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076842 2551 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:20:50.082893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076850 2551 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076855 2551 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076860 2551 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076864 2551 flags.go:64] FLAG: --port="10250" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076868 2551 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076873 2551 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02e1927c695005e27" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076879 2551 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076883 2551 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076888 2551 flags.go:64] FLAG: --register-node="true" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076892 2551 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076900 2551 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076905 2551 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076910 2551 flags.go:64] FLAG: --registry-qps="5" Apr 22 
18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076914 2551 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076919 2551 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076925 2551 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076932 2551 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076936 2551 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076941 2551 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076946 2551 flags.go:64] FLAG: --runonce="false" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076951 2551 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076956 2551 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076961 2551 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076966 2551 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076970 2551 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076975 2551 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:20:50.083450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076979 2551 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076984 2551 flags.go:64] 
FLAG: --storage-driver-password="root" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076989 2551 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.076994 2551 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077000 2551 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077005 2551 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077011 2551 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077015 2551 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077020 2551 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077028 2551 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077033 2551 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077038 2551 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077044 2551 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077048 2551 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077053 2551 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077058 2551 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: 
I0422 18:20:50.077064 2551 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077069 2551 flags.go:64] FLAG: --v="2" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077079 2551 flags.go:64] FLAG: --version="false" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077085 2551 flags.go:64] FLAG: --vmodule="" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077092 2551 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.077097 2551 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077248 2551 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077255 2551 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077259 2551 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:50.084111 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077263 2551 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077268 2551 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077272 2551 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077277 2551 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077281 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:50.084723 
ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077285 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077289 2551 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077293 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077297 2551 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077302 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077306 2551 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077310 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077314 2551 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077318 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077322 2551 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077326 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077330 2551 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077334 2551 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077338 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077342 2551 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:50.084723 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077346 2551 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077350 2551 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077354 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077360 2551 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077365 2551 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077369 2551 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077373 2551 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077377 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077381 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077393 2551 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077399 2551 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077404 2551 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077408 2551 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077413 2551 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077418 2551 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077422 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077426 2551 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077430 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077434 2551 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077438 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:50.085259 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077442 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077446 2551 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077450 2551 feature_gate.go:328] unrecognized feature gate: DNSNameResolver 
Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077454 2551 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077458 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077462 2551 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077466 2551 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077471 2551 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077475 2551 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077479 2551 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077483 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077487 2551 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077491 2551 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077495 2551 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077499 2551 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077505 2551 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077509 2551 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077513 2551 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077517 2551 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077521 2551 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:50.085755 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077525 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077530 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077535 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077539 2551 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077543 2551 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077569 2551 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077574 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077579 2551 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:50.086251 ip-10-0-143-88 
kubenswrapper[2551]: W0422 18:20:50.077583 2551 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077588 2551 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077591 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077596 2551 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077600 2551 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077604 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077608 2551 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077612 2551 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077618 2551 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077624 2551 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:50.086251 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077629 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077633 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077638 2551 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077642 2551 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.077647 2551 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.078356 2551 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.085132 2551 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.085148 2551 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085216 2551 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085224 2551 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085227 2551 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085231 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085234 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085237 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085240 2551 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:50.086702 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085242 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085245 2551 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085247 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085250 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085253 2551 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085256 2551 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085258 2551 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085261 2551 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085264 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085266 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085269 2551 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085272 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085275 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085277 2551 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085279 2551 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085282 2551 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085285 2551 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085287 2551 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085290 2551 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:50.087172 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085292 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085295 2551 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085298 2551 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085300 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085302 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085306 2551 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085309 2551 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085312 2551 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085314 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085317 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085319 2551 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085322 2551 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085324 2551 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085326 2551 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085329 2551 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085331 2551 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085333 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085336 2551 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085339 2551 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085341 2551 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085343 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:50.087677 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085346 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085348 2551 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085351 2551 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085353 2551 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085356 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085358 2551 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085361 2551 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085363 2551 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085366 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085368 2551 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085370 2551 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085373 2551 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085375 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085378 2551 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085380 2551 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085382 2551 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085386 2551 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085389 2551 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085392 2551 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085395 2551 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085397 2551 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:50.088158 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085400 2551 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085402 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085404 2551 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085407 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085409 2551 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085412 2551 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085414 2551 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085416 2551 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085419 2551 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085423 2551 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085426 2551 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085429 2551 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085431 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085434 2551 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085437 2551 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085439 2551 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085442 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:50.088742 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085444 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.085449 2551 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085542 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085546 2551 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085564 2551 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085567 2551 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085571 2551 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085575 2551 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085578 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085581 2551 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085584 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085586 2551 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085589 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085592 2551 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085594 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:50.089154 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085597 2551 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085599 2551 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085602 2551 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085604 2551 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085606 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085609 2551 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085611 2551 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085614 2551 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085616 2551 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085619 2551 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085621 2551 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085623 2551 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085626 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085628 2551 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085631 2551 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085633 2551 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085636 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085638 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085641 2551 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085643 2551 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:50.089525 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085646 2551 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085649 2551 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085651 2551 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085654 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085656 2551 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085658 2551 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085661 2551 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085663 2551 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085666 2551 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085668 2551 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085671 2551 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085674 2551 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085676 2551 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085678 2551 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085681 2551 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085684 2551 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085686 2551 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085688 2551 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085691 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085693 2551 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:50.090066 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085695 2551 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085698 2551 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085700 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085703 2551 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085705 2551 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085708 2551 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085710 2551 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085712 2551 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085715 2551 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085718 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085720 2551 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085722 2551 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085725 2551 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085727 2551 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085730 2551 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085733 2551 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085736 2551 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085739 2551 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085741 2551 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085744 2551 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:50.090570 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085747 2551 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085749 2551 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085752 2551 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085755 2551 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085757 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085760 2551 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085762 2551 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085765 2551 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085768 2551 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085770 2551 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085773 2551 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085775 2551 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:50.085778 2551 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.085782 2551 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.086623 2551 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:20:50.091061 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.088611 2551 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:20:50.091427 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.089637 2551 server.go:1019] "Starting client certificate rotation"
Apr 22 18:20:50.091427 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.089727 2551 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:20:50.091427 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.091331 2551 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:20:50.120185 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.120166 2551 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:20:50.126419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.126397 2551 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:20:50.142193 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.142171 2551 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:20:50.148243 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.148224 2551 log.go:25] "Validated CRI v1 image API"
Apr 22 18:20:50.149477 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.149464 2551 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:20:50.157245 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.157221 2551 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9bbf32e2-d79a-4ab5-82ab-4ed0d054ef46:/dev/nvme0n1p4 d5c5c2c7-1b19-4ef8-a7bf-9f7c9378b94a:/dev/nvme0n1p3]
Apr 22 18:20:50.157304 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.157245 2551 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:20:50.163336 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163229 2551 manager.go:217] Machine: {Timestamp:2026-04-22 18:20:50.160886062 +0000 UTC m=+0.468477368 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097447 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23e3bdeff0d9a1a5e7f0ed245041f5 SystemUUID:ec23e3bd-eff0-d9a1-a5e7-f0ed245041f5 BootID:fdd21b06-5598-4b92-8b34-abba80017605 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:20:06:74:8a:6d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:20:06:74:8a:6d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:9b:14:6a:eb:57 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:20:50.163336 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163332 2551 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:20:50.163429 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163413 2551 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:20:50.163764 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163741 2551 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:20:50.163896 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163765 2551 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-88.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:20:50.163937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163905 2551 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:20:50.163937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163913 2551 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:20:50.163937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.163927 2551 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:20:50.165020 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.165010 2551 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:20:50.166587 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.166575 2551 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:20:50.166691 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.166682 2551 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:20:50.170149 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.170140 2551 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:20:50.170186 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.170153 2551 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:20:50.170186 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.170165 2551 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:20:50.170186 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.170174 2551 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:20:50.170186 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.170183 2551 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:20:50.171390 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.171378 2551 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:20:50.171435 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.171396 2551 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:20:50.173935 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.173916 2551 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22
18:20:50.174690 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.174673 2551 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:20:50.176578 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.176565 2551 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:20:50.178127 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178106 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178134 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178145 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178154 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178162 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178171 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178180 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178195 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178206 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:20:50.178217 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178216 2551 
plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:20:50.178593 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178228 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:20:50.178593 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.178241 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:20:50.179291 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.179275 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:20:50.179359 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.179298 2551 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:20:50.183145 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.183129 2551 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:20:50.183229 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.183166 2551 server.go:1295] "Started kubelet" Apr 22 18:20:50.183283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.183203 2551 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:20:50.183355 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.183317 2551 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:20:50.183406 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.183373 2551 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:20:50.184046 ip-10-0-143-88 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:20:50.184938 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.184816 2551 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-88.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:20:50.184938 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.184933 2551 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:20:50.185240 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.185226 2551 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:20:50.187746 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.187721 2551 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:20:50.187869 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.187850 2551 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-88.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:20:50.189884 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.189867 2551 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:20:50.191137 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.191124 2551 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:20:50.191943 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.191925 2551 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:20:50.192129 ip-10-0-143-88 
kubenswrapper[2551]: E0422 18:20:50.192108 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found" Apr 22 18:20:50.193484 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.193461 2551 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:20:50.193620 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.193608 2551 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:20:50.193851 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.193839 2551 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:20:50.193953 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.193943 2551 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:20:50.194131 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194020 2551 factory.go:55] Registering systemd factory Apr 22 18:20:50.194131 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194133 2551 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:20:50.194613 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194518 2551 factory.go:153] Registering CRI-O factory Apr 22 18:20:50.194613 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194538 2551 factory.go:223] Registration of the crio container factory successfully Apr 22 18:20:50.194710 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194631 2551 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:20:50.194710 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194672 2551 factory.go:103] Registering Raw factory Apr 22 18:20:50.194710 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.194691 2551 manager.go:1196] Started watching for new ooms in manager Apr 22 18:20:50.195146 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:20:50.195132 2551 manager.go:319] Starting recovery of all containers Apr 22 18:20:50.195654 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.195620 2551 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:20:50.199307 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.199144 2551 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-88.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:20:50.199307 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.199231 2551 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:20:50.200241 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.199139 2551 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-88.ec2.internal.18a8c0cf70ef1f1f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-88.ec2.internal,UID:ip-10-0-143-88.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-88.ec2.internal,},FirstTimestamp:2026-04-22 18:20:50.183143199 +0000 UTC m=+0.490734504,LastTimestamp:2026-04-22 18:20:50.183143199 +0000 UTC m=+0.490734504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-88.ec2.internal,}" Apr 22 18:20:50.205642 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.205629 2551 manager.go:324] Recovery completed Apr 22 18:20:50.210421 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.210407 2551 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:50.215480 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.215460 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:50.215543 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.215497 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:50.215543 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.215509 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:50.216014 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.215998 2551 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:20:50.216066 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.216015 2551 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:20:50.216066 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.216041 2551 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:50.218080 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.218018 2551 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-88.ec2.internal.18a8c0cf72dc87a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-88.ec2.internal,UID:ip-10-0-143-88.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-88.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-88.ec2.internal,},FirstTimestamp:2026-04-22 18:20:50.215479204 +0000 UTC m=+0.523070515,LastTimestamp:2026-04-22 18:20:50.215479204 +0000 UTC m=+0.523070515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-88.ec2.internal,}" Apr 22 18:20:50.218300 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.218290 2551 policy_none.go:49] "None policy: Start" Apr 22 18:20:50.218330 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.218308 2551 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:20:50.218330 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.218319 2551 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:20:50.227996 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.227940 2551 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-88.ec2.internal.18a8c0cf72dce716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-88.ec2.internal,UID:ip-10-0-143-88.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-143-88.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-143-88.ec2.internal,},FirstTimestamp:2026-04-22 18:20:50.215503638 +0000 UTC m=+0.523094944,LastTimestamp:2026-04-22 18:20:50.215503638 +0000 UTC m=+0.523094944,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-88.ec2.internal,}" Apr 22 18:20:50.240496 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.240436 2551 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-88.ec2.internal.18a8c0cf72dd10dd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-88.ec2.internal,UID:ip-10-0-143-88.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-143-88.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-143-88.ec2.internal,},FirstTimestamp:2026-04-22 18:20:50.215514333 +0000 UTC m=+0.523105639,LastTimestamp:2026-04-22 18:20:50.215514333 +0000 UTC m=+0.523105639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-88.ec2.internal,}" Apr 22 18:20:50.261990 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.261971 2551 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wzwj9" Apr 22 18:20:50.264185 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264170 2551 manager.go:341] "Starting Device Plugin manager" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.264208 2551 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264217 2551 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wzwj9" Apr 22 18:20:50.279087 
ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264222 2551 server.go:85] "Starting device plugin registration server" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264482 2551 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264495 2551 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264589 2551 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264660 2551 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.264668 2551 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.265316 2551 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:20:50.279087 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.265351 2551 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-88.ec2.internal\" not found" Apr 22 18:20:50.339882 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.339848 2551 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:20:50.341172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.341155 2551 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 18:20:50.341261 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.341184 2551 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:20:50.341261 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.341202 2551 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:20:50.341261 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.341208 2551 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:20:50.341261 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.341240 2551 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:20:50.345246 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.345223 2551 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:50.364609 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.364580 2551 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:50.365614 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.365599 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:50.365700 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.365632 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:50.365700 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.365646 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:50.365700 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.365675 2551 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.373371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.373357 2551 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.373437 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.373377 2551 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-88.ec2.internal\": node \"ip-10-0-143-88.ec2.internal\" not found" Apr 22 18:20:50.407704 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.407679 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found" Apr 22 18:20:50.442195 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.442173 2551 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal"] Apr 22 18:20:50.442273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.442244 2551 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:50.443204 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.443190 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:50.443285 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.443229 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:50.443285 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.443244 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:50.444536 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.444522 2551 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:50.444674 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.444660 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.444721 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.444690 2551 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:50.446305 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.446290 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:50.446305 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.446298 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:50.446433 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.446320 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:50.446433 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.446325 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:50.446433 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.446341 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:50.446433 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.446359 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:50.447645 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.447630 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.447695 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.447663 2551 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:50.449388 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.449364 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:50.449388 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.449382 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:50.449388 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.449391 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:50.470117 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.470095 2551 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-88.ec2.internal\" not found" node="ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.474455 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.474442 2551 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-88.ec2.internal\" not found" node="ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.495781 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.495761 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4cb2a87aae6f4cdbc0ed88f11045cd61-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal\" (UID: \"4cb2a87aae6f4cdbc0ed88f11045cd61\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal" Apr 22 18:20:50.495835 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:20:50.495793 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cb2a87aae6f4cdbc0ed88f11045cd61-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal\" (UID: \"4cb2a87aae6f4cdbc0ed88f11045cd61\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.495835 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.495812 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/34174f385361dd3a899cc07554d23095-config\") pod \"kube-apiserver-proxy-ip-10-0-143-88.ec2.internal\" (UID: \"34174f385361dd3a899cc07554d23095\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.507785 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.507766 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:50.596369 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.596302 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4cb2a87aae6f4cdbc0ed88f11045cd61-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal\" (UID: \"4cb2a87aae6f4cdbc0ed88f11045cd61\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.596369 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.596331 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cb2a87aae6f4cdbc0ed88f11045cd61-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal\" (UID: \"4cb2a87aae6f4cdbc0ed88f11045cd61\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.596369 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.596350 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/34174f385361dd3a899cc07554d23095-config\") pod \"kube-apiserver-proxy-ip-10-0-143-88.ec2.internal\" (UID: \"34174f385361dd3a899cc07554d23095\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.596518 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.596396 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cb2a87aae6f4cdbc0ed88f11045cd61-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal\" (UID: \"4cb2a87aae6f4cdbc0ed88f11045cd61\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.596518 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.596405 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4cb2a87aae6f4cdbc0ed88f11045cd61-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal\" (UID: \"4cb2a87aae6f4cdbc0ed88f11045cd61\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.596518 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.596435 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/34174f385361dd3a899cc07554d23095-config\") pod \"kube-apiserver-proxy-ip-10-0-143-88.ec2.internal\" (UID: \"34174f385361dd3a899cc07554d23095\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.608426 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.608406 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:50.708519 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.708495 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:50.772735 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.772706 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.777317 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:50.777301 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:50.809323 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.809295 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:50.909919 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:50.909848 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:51.010346 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:51.010316 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:51.089568 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.089513 2551 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:20:51.090147 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.089707 2551 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:20:51.111019 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:51.110992 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:51.190295 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.190231 2551 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:20:51.205876 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.205853 2551 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:51.208794 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.208771 2551 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:20:51.211222 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:51.211201 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:51.245168 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:51.245135 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34174f385361dd3a899cc07554d23095.slice/crio-1e64a95dc82c575ffb3450734d03c4e723a862b9b66318f43577c0aec1a34d95 WatchSource:0}: Error finding container 1e64a95dc82c575ffb3450734d03c4e723a862b9b66318f43577c0aec1a34d95: Status 404 returned error can't find the container with id 1e64a95dc82c575ffb3450734d03c4e723a862b9b66318f43577c0aec1a34d95
Apr 22 18:20:51.245600 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:51.245580 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cb2a87aae6f4cdbc0ed88f11045cd61.slice/crio-f31d7035151125d99fc14bf3ffdcd2d81d90da453c9e3f4b151fc64b028afe2a WatchSource:0}: Error finding container f31d7035151125d99fc14bf3ffdcd2d81d90da453c9e3f4b151fc64b028afe2a: Status 404 returned error can't find the container with id f31d7035151125d99fc14bf3ffdcd2d81d90da453c9e3f4b151fc64b028afe2a
Apr 22 18:20:51.249629 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.249615 2551 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:20:51.266438 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.266413 2551 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:15:50 +0000 UTC" deadline="2027-10-25 04:12:09.820932439 +0000 UTC"
Apr 22 18:20:51.266438 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.266437 2551 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13209h51m18.554498428s"
Apr 22 18:20:51.311975 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:51.311951 2551 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-88.ec2.internal\" not found"
Apr 22 18:20:51.326408 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.326388 2551 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:51.326610 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.326592 2551 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kjqt5"
Apr 22 18:20:51.333599 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.333570 2551 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kjqt5"
Apr 22 18:20:51.344601 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.344540 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal" event={"ID":"34174f385361dd3a899cc07554d23095","Type":"ContainerStarted","Data":"1e64a95dc82c575ffb3450734d03c4e723a862b9b66318f43577c0aec1a34d95"}
Apr 22 18:20:51.345352 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.345330 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal" event={"ID":"4cb2a87aae6f4cdbc0ed88f11045cd61","Type":"ContainerStarted","Data":"f31d7035151125d99fc14bf3ffdcd2d81d90da453c9e3f4b151fc64b028afe2a"}
Apr 22 18:20:51.391499 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.391474 2551 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:51.408189 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.408173 2551 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:20:51.410060 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.410048 2551 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal"
Apr 22 18:20:51.427150 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.427122 2551 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:20:51.626093 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:51.625901 2551 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:52.171807 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.171784 2551 apiserver.go:52] "Watching apiserver"
Apr 22 18:20:52.180357 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.180332 2551 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:20:52.180802 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.180779 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-49bjb","openshift-network-diagnostics/network-check-target-2h4lz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln","openshift-image-registry/node-ca-vs5cv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal","openshift-multus/multus-additional-cni-plugins-rz6nj","openshift-multus/multus-dskdj","openshift-multus/network-metrics-daemon-vvfq2","openshift-network-operator/iptables-alerter-fkl7j","openshift-ovn-kubernetes/ovnkube-node-gz6xk","kube-system/konnectivity-agent-z9whf","kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal","openshift-cluster-node-tuning-operator/tuned-nrnpx"]
Apr 22 18:20:52.183776 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.183385 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.184429 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.184400 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:52.184518 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.184481 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:20:52.185577 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.185471 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.185577 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.185566 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.185769 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.185749 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:20:52.186066 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.185907 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:20:52.186193 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.186175 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:20:52.186259 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.186205 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dpjth\""
Apr 22 18:20:52.186259 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.186238 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:20:52.187010 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.186989 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.189521 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.189500 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:52.189644 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.189586 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:20:52.190114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190097 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190324 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190354 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190375 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190392 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190427 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-c4q6z\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190480 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:20:52.190514 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190507 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:20:52.190898 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190375 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:20:52.190898 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190709 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.190898 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.190818 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fkl7j"
Apr 22 18:20:52.191175 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.191153 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-245xn\""
Apr 22 18:20:52.192857 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.192838 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.195813 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.195789 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z9whf"
Apr 22 18:20:52.195904 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.195813 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.205713 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205642 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-slash\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.205713 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205675 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-run-netns\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.205713 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205701 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ebef991-9163-4261-ab37-3aa25f981e43-host\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205728 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-ovn\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205769 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-cni-netd\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205793 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c510803-84b3-405a-93b2-ef05315ce43a-cni-binary-copy\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205815 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-conf-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205837 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-os-release\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205862 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.205891 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205885 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4gp\" (UniqueName: \"kubernetes.io/projected/c9321020-7781-4341-88ff-61119a578087-kube-api-access-2s4gp\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205908 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebef991-9163-4261-ab37-3aa25f981e43-serviceca\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205940 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-etc-kubernetes\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.205964 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-sys-fs\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206002 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206037 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovnkube-script-lib\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206062 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c818e9ac-a0e9-433e-9277-17075adf6903-tmp-dir\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206086 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-kubelet\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206109 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-hostroot\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206133 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-etc-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206164 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:52.206183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206186 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-socket-dir-parent\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206201 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c510803-84b3-405a-93b2-ef05315ce43a-multus-daemon-config\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206230 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c818e9ac-a0e9-433e-9277-17075adf6903-hosts-file\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206248 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-device-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206266 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-cni-multus\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206280 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-etc-selinux\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206294 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206319 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2nj4\" (UniqueName: \"kubernetes.io/projected/1dc684e4-5e8a-4c53-8f22-806b580c14f9-kube-api-access-l2nj4\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206332 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-cnibin\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206349 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-kubelet\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206373 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206399 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqzrp\" (UniqueName: \"kubernetes.io/projected/5ebef991-9163-4261-ab37-3aa25f981e43-kube-api-access-xqzrp\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206416 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-k8s-cni-cncf-io\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206429 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovnkube-config\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206444 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-systemd\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206457 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-node-log\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.206485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206471 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206486 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-cni-bin\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206499 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-multus-certs\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206513 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9n8\" (UniqueName: \"kubernetes.io/projected/3c510803-84b3-405a-93b2-ef05315ce43a-kube-api-access-gr9n8\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206528 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206544 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206583 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-system-cni-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206602 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-os-release\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206623 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-systemd-units\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206638 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-cni-binary-copy\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206664 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovn-node-metrics-cert\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206684 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e3da756-d433-4424-94e8-89918c5586cd-host-slash\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206702 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-socket-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206734 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcr8w\" (UniqueName: \"kubernetes.io/projected/f76d2d65-0522-4be1-946b-e6c684435f2d-kube-api-access-tcr8w\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206755 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ntwc\" (UniqueName: \"kubernetes.io/projected/9e3da756-d433-4424-94e8-89918c5586cd-kube-api-access-7ntwc\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j"
Apr 22
18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206778 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-system-cni-dir\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.207107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206799 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-var-lib-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206819 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206840 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-env-overrides\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206864 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-registration-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206886 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-cni-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206913 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-cnibin\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206928 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-log-socket\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206950 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lz5m\" (UniqueName: \"kubernetes.io/projected/9ee7d206-a285-4a35-a05b-5ea4d067ba50-kube-api-access-4lz5m\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.206982 2551 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-cni-bin\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.207031 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8d6f\" (UniqueName: \"kubernetes.io/projected/c818e9ac-a0e9-433e-9277-17075adf6903-kube-api-access-z8d6f\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.207058 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-netns\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.207666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.207079 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e3da756-d433-4424-94e8-89918c5586cd-iptables-alerter-script\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j" Apr 22 18:20:52.209794 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.209742 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:20:52.212441 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.209749 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:20:52.212441 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.210448 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:20:52.212441 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.210495 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xbbvw\"" Apr 22 18:20:52.212441 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.210877 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:20:52.220229 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.220209 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:20:52.221283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.220689 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:20:52.221283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.220744 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q64zh\"" Apr 22 18:20:52.221283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.220747 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:20:52.221283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.220845 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:20:52.221283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221178 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 
18:20:52.221283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221232 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:20:52.221693 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221658 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:20:52.221693 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221679 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:20:52.221693 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221679 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-l4w5z\"" Apr 22 18:20:52.221851 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221742 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jts9f\"" Apr 22 18:20:52.221851 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221756 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kqgx2\"" Apr 22 18:20:52.221851 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221686 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:20:52.221851 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221822 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:20:52.221997 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.221933 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:20:52.223479 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:20:52.223459 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2w9sz\"" Apr 22 18:20:52.292572 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.292531 2551 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:20:52.292907 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.292891 2551 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:52.307284 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307257 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ntwc\" (UniqueName: \"kubernetes.io/projected/9e3da756-d433-4424-94e8-89918c5586cd-kube-api-access-7ntwc\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307290 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-system-cni-dir\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307307 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-var-lib-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307351 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307369 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-env-overrides\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307385 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-registration-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307389 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-system-cni-dir\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307400 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-cni-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.307419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307419 2551 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-cnibin\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307436 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-log-socket\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307508 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lz5m\" (UniqueName: \"kubernetes.io/projected/9ee7d206-a285-4a35-a05b-5ea4d067ba50-kube-api-access-4lz5m\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307534 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-var-lib-kubelet\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307575 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-cni-bin\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307596 2551 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8d6f\" (UniqueName: \"kubernetes.io/projected/c818e9ac-a0e9-433e-9277-17075adf6903-kube-api-access-z8d6f\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307613 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-modprobe-d\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307639 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysctl-d\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307656 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfcb0720-c589-4598-b75b-3f5d1ec714d3-tmp\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.307694 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307670 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba-agent-certs\") pod \"konnectivity-agent-z9whf\" (UID: \"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba\") " pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:20:52.307694 
ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307686 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-netns\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307702 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e3da756-d433-4424-94e8-89918c5586cd-iptables-alerter-script\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307721 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-slash\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307726 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-var-lib-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307736 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-run-netns\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 
18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307761 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ebef991-9163-4261-ab37-3aa25f981e43-host\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307776 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-run\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307802 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6tw\" (UniqueName: \"kubernetes.io/projected/bfcb0720-c589-4598-b75b-3f5d1ec714d3-kube-api-access-4w6tw\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307822 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-ovn\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307838 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-cni-netd\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 
18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307863 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c510803-84b3-405a-93b2-ef05315ce43a-cni-binary-copy\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307881 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-conf-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307899 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-os-release\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307917 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.307977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307927 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-env-overrides\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307992 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-cnibin\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308021 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-registration-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308038 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-log-socket\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.307933 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4gp\" (UniqueName: \"kubernetes.io/projected/c9321020-7781-4341-88ff-61119a578087-kube-api-access-2s4gp\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308071 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz6xk\" (UID: 
\"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308075 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebef991-9163-4261-ab37-3aa25f981e43-serviceca\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308106 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-etc-kubernetes\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308133 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-sys-fs\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308137 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-etc-kubernetes\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308163 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308186 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-sys-fs\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308266 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-ovn\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308281 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-cni-bin\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308320 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ebef991-9163-4261-ab37-3aa25f981e43-host\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.308372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308351 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-cni-netd\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308383 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-conf-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308435 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-os-release\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308466 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-run-netns\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308651 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebef991-9163-4261-ab37-3aa25f981e43-serviceca\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308701 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-cni-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308756 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovnkube-script-lib\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308775 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c818e9ac-a0e9-433e-9277-17075adf6903-tmp-dir\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308793 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysctl-conf\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308804 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c510803-84b3-405a-93b2-ef05315ce43a-cni-binary-copy\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308815 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-systemd\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308830 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-host\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308868 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-kubelet\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308886 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-hostroot\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308914 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-etc-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.308934 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308928 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308954 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-tuned\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308972 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba-konnectivity-ca\") pod \"konnectivity-agent-z9whf\" (UID: \"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba\") " pod="kube-system/konnectivity-agent-z9whf"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308989 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-netns\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308989 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309006 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-socket-dir-parent\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.308952 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309054 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-slash\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309090 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-kubelet\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.309091 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309118 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-hostroot\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309137 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c510803-84b3-405a-93b2-ef05315ce43a-multus-daemon-config\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309185 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-etc-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.309206 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:20:52.809152849 +0000 UTC m=+3.116744163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309321 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-multus-socket-dir-parent\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309355 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c818e9ac-a0e9-433e-9277-17075adf6903-tmp-dir\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309362 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c818e9ac-a0e9-433e-9277-17075adf6903-hosts-file\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.309540 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309393 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-device-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309434 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-cni-multus\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309468 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-etc-selinux\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309483 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c818e9ac-a0e9-433e-9277-17075adf6903-hosts-file\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309486 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-cni-multus\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309515 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-device-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309534 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309577 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovnkube-script-lib\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309589 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2nj4\" (UniqueName: \"kubernetes.io/projected/1dc684e4-5e8a-4c53-8f22-806b580c14f9-kube-api-access-l2nj4\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309595 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-etc-selinux\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309599 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309581 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e3da756-d433-4424-94e8-89918c5586cd-iptables-alerter-script\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309636 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-cnibin\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309664 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-kubelet\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309693 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-kubelet\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309692 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309712 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c510803-84b3-405a-93b2-ef05315ce43a-multus-daemon-config\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309725 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9321020-7781-4341-88ff-61119a578087-cnibin\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309732 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqzrp\" (UniqueName: \"kubernetes.io/projected/5ebef991-9163-4261-ab37-3aa25f981e43-kube-api-access-xqzrp\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309770 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-kubernetes\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309879 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-k8s-cni-cncf-io\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309896 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-openvswitch\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309928 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovnkube-config\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309955 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-k8s-cni-cncf-io\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.309991 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-sys\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310009 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-lib-modules\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310029 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-systemd\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310045 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-node-log\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310098 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-run-systemd\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310106 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-node-log\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310141 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310183 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-cni-bin\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310319 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-multus-certs\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310339 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-var-lib-cni-bin\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310376 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9n8\" (UniqueName: \"kubernetes.io/projected/3c510803-84b3-405a-93b2-ef05315ce43a-kube-api-access-gr9n8\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.310724 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310404 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-host-run-multus-certs\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310435 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310453 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysconfig\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310428 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovnkube-config\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310491 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310528 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310532 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-system-cni-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310589 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-os-release\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310603 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-system-cni-dir\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310637 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-systemd-units\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310672 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-cni-binary-copy\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310694 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c510803-84b3-405a-93b2-ef05315ce43a-os-release\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310716 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ee7d206-a285-4a35-a05b-5ea4d067ba50-systemd-units\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310719 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovn-node-metrics-cert\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310759 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e3da756-d433-4424-94e8-89918c5586cd-host-slash\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310793 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-socket-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310819 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcr8w\" (UniqueName: \"kubernetes.io/projected/f76d2d65-0522-4be1-946b-e6c684435f2d-kube-api-access-tcr8w\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:52.311248 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310908 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.311752 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310916 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1dc684e4-5e8a-4c53-8f22-806b580c14f9-socket-dir\") pod \"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln"
Apr 22 18:20:52.311752 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310912 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e3da756-d433-4424-94e8-89918c5586cd-host-slash\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j"
Apr 22 18:20:52.311752 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.310994 2551 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:20:52.311752 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.311133 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9321020-7781-4341-88ff-61119a578087-cni-binary-copy\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj"
Apr 22 18:20:52.313883 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.313865 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ee7d206-a285-4a35-a05b-5ea4d067ba50-ovn-node-metrics-cert\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk"
Apr 22 18:20:52.324950 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.324924 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:52.324950 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.324952 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:52.325098 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.324965 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:52.325098 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.325030 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:52.82501271 +0000 UTC m=+3.132604007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:52.326595 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.326240 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ntwc\" (UniqueName: \"kubernetes.io/projected/9e3da756-d433-4424-94e8-89918c5586cd-kube-api-access-7ntwc\") pod \"iptables-alerter-fkl7j\" (UID: \"9e3da756-d433-4424-94e8-89918c5586cd\") " pod="openshift-network-operator/iptables-alerter-fkl7j" Apr 22 18:20:52.326771 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.326748 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9n8\" (UniqueName: \"kubernetes.io/projected/3c510803-84b3-405a-93b2-ef05315ce43a-kube-api-access-gr9n8\") pod \"multus-dskdj\" (UID: \"3c510803-84b3-405a-93b2-ef05315ce43a\") " pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.327382 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.327357 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2nj4\" (UniqueName: \"kubernetes.io/projected/1dc684e4-5e8a-4c53-8f22-806b580c14f9-kube-api-access-l2nj4\") pod 
\"aws-ebs-csi-driver-node-b9tln\" (UID: \"1dc684e4-5e8a-4c53-8f22-806b580c14f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" Apr 22 18:20:52.327575 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.327480 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lz5m\" (UniqueName: \"kubernetes.io/projected/9ee7d206-a285-4a35-a05b-5ea4d067ba50-kube-api-access-4lz5m\") pod \"ovnkube-node-gz6xk\" (UID: \"9ee7d206-a285-4a35-a05b-5ea4d067ba50\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.328339 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.328243 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcr8w\" (UniqueName: \"kubernetes.io/projected/f76d2d65-0522-4be1-946b-e6c684435f2d-kube-api-access-tcr8w\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:20:52.328821 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.328796 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8d6f\" (UniqueName: \"kubernetes.io/projected/c818e9ac-a0e9-433e-9277-17075adf6903-kube-api-access-z8d6f\") pod \"node-resolver-49bjb\" (UID: \"c818e9ac-a0e9-433e-9277-17075adf6903\") " pod="openshift-dns/node-resolver-49bjb" Apr 22 18:20:52.328983 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.328966 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4gp\" (UniqueName: \"kubernetes.io/projected/c9321020-7781-4341-88ff-61119a578087-kube-api-access-2s4gp\") pod \"multus-additional-cni-plugins-rz6nj\" (UID: \"c9321020-7781-4341-88ff-61119a578087\") " pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.329065 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.329006 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqzrp\" 
(UniqueName: \"kubernetes.io/projected/5ebef991-9163-4261-ab37-3aa25f981e43-kube-api-access-xqzrp\") pod \"node-ca-vs5cv\" (UID: \"5ebef991-9163-4261-ab37-3aa25f981e43\") " pod="openshift-image-registry/node-ca-vs5cv" Apr 22 18:20:52.335253 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.335223 2551 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:51 +0000 UTC" deadline="2027-12-04 04:26:55.810577122 +0000 UTC" Apr 22 18:20:52.335345 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.335253 2551 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14170h6m3.475327843s" Apr 22 18:20:52.411322 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411289 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-modprobe-d\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411484 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411329 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysctl-d\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411484 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411387 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfcb0720-c589-4598-b75b-3f5d1ec714d3-tmp\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411484 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411418 2551 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba-agent-certs\") pod \"konnectivity-agent-z9whf\" (UID: \"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba\") " pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:20:52.411484 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411444 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-run\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411484 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411466 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w6tw\" (UniqueName: \"kubernetes.io/projected/bfcb0720-c589-4598-b75b-3f5d1ec714d3-kube-api-access-4w6tw\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411486 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysctl-d\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411487 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-modprobe-d\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411498 2551 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysctl-conf\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411544 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-run\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411574 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-systemd\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411602 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-host\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411642 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-tuned\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411644 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysctl-conf\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411668 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba-konnectivity-ca\") pod \"konnectivity-agent-z9whf\" (UID: \"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba\") " pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411709 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-kubernetes\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411722 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-host\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411734 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-sys\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.411753 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411751 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-systemd\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411757 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-lib-modules\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411849 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-lib-modules\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411860 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysconfig\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411862 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-kubernetes\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411897 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-sysconfig\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411894 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-sys\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411919 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-var-lib-kubelet\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.411987 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfcb0720-c589-4598-b75b-3f5d1ec714d3-var-lib-kubelet\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.412878 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.412855 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba-konnectivity-ca\") pod \"konnectivity-agent-z9whf\" (UID: \"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba\") " pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:20:52.413764 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.413744 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/bfcb0720-c589-4598-b75b-3f5d1ec714d3-tmp\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.414089 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.414067 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba-agent-certs\") pod \"konnectivity-agent-z9whf\" (UID: \"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba\") " pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:20:52.414235 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.414214 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfcb0720-c589-4598-b75b-3f5d1ec714d3-etc-tuned\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.423593 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.423525 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6tw\" (UniqueName: \"kubernetes.io/projected/bfcb0720-c589-4598-b75b-3f5d1ec714d3-kube-api-access-4w6tw\") pod \"tuned-nrnpx\" (UID: \"bfcb0720-c589-4598-b75b-3f5d1ec714d3\") " pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.495699 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.495675 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dskdj" Apr 22 18:20:52.502361 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.502342 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" Apr 22 18:20:52.510836 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.510817 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vs5cv" Apr 22 18:20:52.518301 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.518282 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" Apr 22 18:20:52.524522 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.524499 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-49bjb" Apr 22 18:20:52.532056 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.532041 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fkl7j" Apr 22 18:20:52.537697 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.537678 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:20:52.543494 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.543477 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:20:52.549038 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.549019 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" Apr 22 18:20:52.814373 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.814303 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:20:52.814493 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.814444 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:52.814538 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.814510 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:20:53.814495966 +0000 UTC m=+4.122087261 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:52.842730 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.842697 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c510803_84b3_405a_93b2_ef05315ce43a.slice/crio-9dca38a3a792a2a8165c0c587fd525048cf368fde5b17be7a91342e9af0f5327 WatchSource:0}: Error finding container 9dca38a3a792a2a8165c0c587fd525048cf368fde5b17be7a91342e9af0f5327: Status 404 returned error can't find the container with id 9dca38a3a792a2a8165c0c587fd525048cf368fde5b17be7a91342e9af0f5327 Apr 22 18:20:52.844826 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.844799 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc684e4_5e8a_4c53_8f22_806b580c14f9.slice/crio-7df049e01b8e6f2288cac75a5624e28e0caafbf6d75ed72a1670e7caf9d4b7c6 WatchSource:0}: Error finding container 7df049e01b8e6f2288cac75a5624e28e0caafbf6d75ed72a1670e7caf9d4b7c6: Status 404 returned error can't find the container with id 7df049e01b8e6f2288cac75a5624e28e0caafbf6d75ed72a1670e7caf9d4b7c6 Apr 22 18:20:52.849482 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.849459 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc818e9ac_a0e9_433e_9277_17075adf6903.slice/crio-0f1017ba281d1f8f38e81941256803aa66ba2a86c1f96d58d0924cd49c9194c3 WatchSource:0}: Error finding container 0f1017ba281d1f8f38e81941256803aa66ba2a86c1f96d58d0924cd49c9194c3: Status 404 returned error can't find the container with id 0f1017ba281d1f8f38e81941256803aa66ba2a86c1f96d58d0924cd49c9194c3 Apr 22 18:20:52.850330 
ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.850309 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ebef991_9163_4261_ab37_3aa25f981e43.slice/crio-ad30f3f851a4dcd6c83612f34359264184f063aad811853b27bb157a01f7efe4 WatchSource:0}: Error finding container ad30f3f851a4dcd6c83612f34359264184f063aad811853b27bb157a01f7efe4: Status 404 returned error can't find the container with id ad30f3f851a4dcd6c83612f34359264184f063aad811853b27bb157a01f7efe4 Apr 22 18:20:52.851003 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.850868 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfcb0720_c589_4598_b75b_3f5d1ec714d3.slice/crio-8c5d85acfd8124f237a63fe0fc34c4f0d5b8371ae28aacd686c5ecf52d209984 WatchSource:0}: Error finding container 8c5d85acfd8124f237a63fe0fc34c4f0d5b8371ae28aacd686c5ecf52d209984: Status 404 returned error can't find the container with id 8c5d85acfd8124f237a63fe0fc34c4f0d5b8371ae28aacd686c5ecf52d209984 Apr 22 18:20:52.851762 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.851738 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee7d206_a285_4a35_a05b_5ea4d067ba50.slice/crio-2e4dc2e308bd60571308c2186bf0815a63f6f67c351e6c7fd187779fd3bc8745 WatchSource:0}: Error finding container 2e4dc2e308bd60571308c2186bf0815a63f6f67c351e6c7fd187779fd3bc8745: Status 404 returned error can't find the container with id 2e4dc2e308bd60571308c2186bf0815a63f6f67c351e6c7fd187779fd3bc8745 Apr 22 18:20:52.853086 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.853055 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e3da756_d433_4424_94e8_89918c5586cd.slice/crio-42d9a1dd5d81268b55c0f4cc62bf0cb3303f9bed3ea2acba65c2512406c90e09 WatchSource:0}: Error 
finding container 42d9a1dd5d81268b55c0f4cc62bf0cb3303f9bed3ea2acba65c2512406c90e09: Status 404 returned error can't find the container with id 42d9a1dd5d81268b55c0f4cc62bf0cb3303f9bed3ea2acba65c2512406c90e09
Apr 22 18:20:52.855282 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:20:52.855260 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8bb1f9_6a08_4ee4_a71a_28dee8bb54ba.slice/crio-8f2542ffc6f9a5476ff2ff8ea97be87ec682ac1614ce4f99436abd0b34f7dd98 WatchSource:0}: Error finding container 8f2542ffc6f9a5476ff2ff8ea97be87ec682ac1614ce4f99436abd0b34f7dd98: Status 404 returned error can't find the container with id 8f2542ffc6f9a5476ff2ff8ea97be87ec682ac1614ce4f99436abd0b34f7dd98
Apr 22 18:20:52.915311 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:52.915189 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:52.915388 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.915329 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:52.915388 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.915348 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:52.915388 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.915358 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:52.915473 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:52.915401 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:53.915388252 +0000 UTC m=+4.222979545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:53.336027 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.335984 2551 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:51 +0000 UTC" deadline="2028-02-07 18:23:59.239561774 +0000 UTC"
Apr 22 18:20:53.336027 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.336025 2551 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15744h3m5.903540827s"
Apr 22 18:20:53.357751 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.357717 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dskdj" event={"ID":"3c510803-84b3-405a-93b2-ef05315ce43a","Type":"ContainerStarted","Data":"9dca38a3a792a2a8165c0c587fd525048cf368fde5b17be7a91342e9af0f5327"}
Apr 22 18:20:53.362096 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.362060 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z9whf" event={"ID":"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba","Type":"ContainerStarted","Data":"8f2542ffc6f9a5476ff2ff8ea97be87ec682ac1614ce4f99436abd0b34f7dd98"}
Apr 22 18:20:53.363831 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.363805 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fkl7j" event={"ID":"9e3da756-d433-4424-94e8-89918c5586cd","Type":"ContainerStarted","Data":"42d9a1dd5d81268b55c0f4cc62bf0cb3303f9bed3ea2acba65c2512406c90e09"}
Apr 22 18:20:53.367577 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.367536 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"2e4dc2e308bd60571308c2186bf0815a63f6f67c351e6c7fd187779fd3bc8745"}
Apr 22 18:20:53.371136 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.371114 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" event={"ID":"bfcb0720-c589-4598-b75b-3f5d1ec714d3","Type":"ContainerStarted","Data":"8c5d85acfd8124f237a63fe0fc34c4f0d5b8371ae28aacd686c5ecf52d209984"}
Apr 22 18:20:53.375031 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.375003 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-49bjb" event={"ID":"c818e9ac-a0e9-433e-9277-17075adf6903","Type":"ContainerStarted","Data":"0f1017ba281d1f8f38e81941256803aa66ba2a86c1f96d58d0924cd49c9194c3"}
Apr 22 18:20:53.379300 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.379271 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerStarted","Data":"64349f51ba3d7428dcba4e5cc0912873e01b844a37f9dd3772034929637546f3"}
Apr 22 18:20:53.382200 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.382175 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal" event={"ID":"34174f385361dd3a899cc07554d23095","Type":"ContainerStarted","Data":"0efcbbed98bfeb472e386fc3ddf0f008b562e07802a41c531d2a0e8152e7899b"}
Apr 22 18:20:53.384132 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.384109 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vs5cv" event={"ID":"5ebef991-9163-4261-ab37-3aa25f981e43","Type":"ContainerStarted","Data":"ad30f3f851a4dcd6c83612f34359264184f063aad811853b27bb157a01f7efe4"}
Apr 22 18:20:53.385961 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.385941 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" event={"ID":"1dc684e4-5e8a-4c53-8f22-806b580c14f9","Type":"ContainerStarted","Data":"7df049e01b8e6f2288cac75a5624e28e0caafbf6d75ed72a1670e7caf9d4b7c6"}
Apr 22 18:20:53.824019 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.823981 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:53.824199 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:53.824157 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:53.824263 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:53.824220 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:20:55.824201597 +0000 UTC m=+6.131792890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:53.925335 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:53.924688 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:53.925335 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:53.924869 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:53.925335 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:53.924890 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:53.925335 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:53.924903 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:53.925335 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:53.924962 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:55.924942605 +0000 UTC m=+6.232533915 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:54.344856 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:54.344822 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:54.345377 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:54.344964 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:20:54.345377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:54.345049 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:54.345377 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:54.345122 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:20:54.410353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:54.409013 2551 generic.go:358] "Generic (PLEG): container finished" podID="4cb2a87aae6f4cdbc0ed88f11045cd61" containerID="4dd8a03d8887013a995ef2203d24d6323555bbab1effe730dc0881267008b188" exitCode=0
Apr 22 18:20:54.410353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:54.409947 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal" event={"ID":"4cb2a87aae6f4cdbc0ed88f11045cd61","Type":"ContainerDied","Data":"4dd8a03d8887013a995ef2203d24d6323555bbab1effe730dc0881267008b188"}
Apr 22 18:20:54.425280 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:54.425006 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-88.ec2.internal" podStartSLOduration=3.424991344 podStartE2EDuration="3.424991344s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:53.399540609 +0000 UTC m=+3.707131953" watchObservedRunningTime="2026-04-22 18:20:54.424991344 +0000 UTC m=+4.732582663"
Apr 22 18:20:55.426925 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:55.426232 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal" event={"ID":"4cb2a87aae6f4cdbc0ed88f11045cd61","Type":"ContainerStarted","Data":"7cf7032e818599ef1beef7174ce24af179fc718bf65198a664f1610ee7823bdf"}
Apr 22 18:20:55.838835 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:55.838799 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:55.838991 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:55.838936 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:55.839036 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:55.838998 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:20:59.838981591 +0000 UTC m=+10.146572931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:55.940146 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:55.940107 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:55.940311 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:55.940297 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:55.940416 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:55.940315 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:55.940416 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:55.940328 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:55.940416 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:55.940388 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:59.940369697 +0000 UTC m=+10.247961008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:56.345116 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:56.342100 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:56.345116 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:56.342243 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:20:56.345116 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:56.344795 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:56.345116 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:56.344877 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:20:58.343777 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:58.343736 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:58.344221 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:58.343886 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:20:58.344391 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:58.344373 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:58.344483 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:58.344464 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:20:59.876208 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:59.876170 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:20:59.876664 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:59.876331 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:59.876664 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:59.876411 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:07.876389226 +0000 UTC m=+18.183980522 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:59.977095 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:20:59.976536 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:20:59.977095 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:59.976701 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:59.977095 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:59.976720 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:59.977095 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:59.976732 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:59.977095 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:20:59.976774 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:07.976762207 +0000 UTC m=+18.284353500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:00.342968 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:00.342936 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:00.343142 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:00.343050 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:00.343476 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:00.343458 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:00.343602 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:00.343582 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:02.341961 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:02.341923 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:02.342343 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:02.342033 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:02.342343 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:02.342090 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:02.342343 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:02.342178 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:04.341863 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:04.341825 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:04.342207 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:04.341825 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:04.342207 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:04.341953 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:04.342207 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:04.342006 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:06.341600 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:06.341568 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:06.342025 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:06.341568 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:06.342025 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:06.341710 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:06.342025 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:06.341729 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:07.928449 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:07.928410 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:07.928905 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:07.928590 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:07.928905 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:07.928658 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:23.928642255 +0000 UTC m=+34.236233553 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:08.029303 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:08.029269 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:08.029609 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:08.029403 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:21:08.029609 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:08.029421 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:21:08.029609 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:08.029430 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:08.029609 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:08.029488 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:24.029469358 +0000 UTC m=+34.337060656 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:08.342076 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:08.342042 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:08.342076 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:08.342071 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:08.342318 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:08.342230 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:08.342397 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:08.342366 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:10.342907 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.342768 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:10.343410 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:10.342969 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:10.343410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.342832 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:10.343410 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:10.343020 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:10.452526 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.452480 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vs5cv" event={"ID":"5ebef991-9163-4261-ab37-3aa25f981e43","Type":"ContainerStarted","Data":"3d78505d8a0a540dfdabbe3202b5cb0b79a6a159a3dd2989850ae00e875ceef5"}
Apr 22 18:21:10.454135 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.454105 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" event={"ID":"1dc684e4-5e8a-4c53-8f22-806b580c14f9","Type":"ContainerStarted","Data":"11e5eeaf9d3e112d402a25e0badc1234c1cd5bb2e46529b685afd3cc26f53b03"}
Apr 22 18:21:10.455613 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.455579 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dskdj" event={"ID":"3c510803-84b3-405a-93b2-ef05315ce43a","Type":"ContainerStarted","Data":"00d1a936226525ab87b7770538075d0f9449cce0583047645ffa666d63f2b396"}
Apr 22 18:21:10.456910 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.456856 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z9whf" event={"ID":"2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba","Type":"ContainerStarted","Data":"cc27cda2915daa55d3c916108d91477d898a497514dbd4541a134a7e5c14efe9"}
Apr 22 18:21:10.458340 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.458309 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"5172c30f6778a70d7f8c03caed9baaca90a87e4d59028960b45e2d1da4baec67"}
Apr 22 18:21:10.458340 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.458334 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"a704aba35d39f8249308eb8164621e5e4c24869b889f70af199eca801f1ff2c5"}
Apr 22 18:21:10.459570 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.459531 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" event={"ID":"bfcb0720-c589-4598-b75b-3f5d1ec714d3","Type":"ContainerStarted","Data":"a8b403ffc1f9366e8dc86dbf8b065742e1136eda2a98e8b2ddb5706511a518c6"}
Apr 22 18:21:10.472409 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.472386 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-49bjb" event={"ID":"c818e9ac-a0e9-433e-9277-17075adf6903","Type":"ContainerStarted","Data":"66179e41d954de5ecbe495ac43a7789f785ab1f04ac622faeb05478f612ff504"}
Apr 22 18:21:10.473450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.473432 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerStarted","Data":"75a7bdbf3f5a15d6856ceefd01f165dc4ab18dd1c35f3de381c48b4181c6e350"}
Apr 22 18:21:10.483862 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.483694 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-88.ec2.internal" podStartSLOduration=19.483683926 podStartE2EDuration="19.483683926s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:55.441256951 +0000 UTC m=+5.748848267" watchObservedRunningTime="2026-04-22 18:21:10.483683926 +0000 UTC m=+20.791275242"
Apr 22 18:21:10.523730 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.523684 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vs5cv"
podStartSLOduration=3.463224162 podStartE2EDuration="20.523668475s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.852124909 +0000 UTC m=+3.159716214" lastFinishedPulling="2026-04-22 18:21:09.91256922 +0000 UTC m=+20.220160527" observedRunningTime="2026-04-22 18:21:10.483942858 +0000 UTC m=+20.791534153" watchObservedRunningTime="2026-04-22 18:21:10.523668475 +0000 UTC m=+20.831259806" Apr 22 18:21:10.523869 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.523844 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dskdj" podStartSLOduration=3.372173205 podStartE2EDuration="20.52383919s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.845100142 +0000 UTC m=+3.152691437" lastFinishedPulling="2026-04-22 18:21:09.996766127 +0000 UTC m=+20.304357422" observedRunningTime="2026-04-22 18:21:10.521370175 +0000 UTC m=+20.828961492" watchObservedRunningTime="2026-04-22 18:21:10.52383919 +0000 UTC m=+20.831430504" Apr 22 18:21:10.595457 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.595402 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z9whf" podStartSLOduration=3.50488005 podStartE2EDuration="20.595381907s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.85884907 +0000 UTC m=+3.166440376" lastFinishedPulling="2026-04-22 18:21:09.949350935 +0000 UTC m=+20.256942233" observedRunningTime="2026-04-22 18:21:10.593179509 +0000 UTC m=+20.900770825" watchObservedRunningTime="2026-04-22 18:21:10.595381907 +0000 UTC m=+20.902973223" Apr 22 18:21:10.626470 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:10.626422 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nrnpx" podStartSLOduration=7.941362899 podStartE2EDuration="20.626407759s" 
podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.852636804 +0000 UTC m=+3.160228098" lastFinishedPulling="2026-04-22 18:21:05.537681662 +0000 UTC m=+15.845272958" observedRunningTime="2026-04-22 18:21:10.625727809 +0000 UTC m=+20.933319124" watchObservedRunningTime="2026-04-22 18:21:10.626407759 +0000 UTC m=+20.933999074" Apr 22 18:21:11.362065 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.362042 2551 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:21:11.477077 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.477048 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" event={"ID":"1dc684e4-5e8a-4c53-8f22-806b580c14f9","Type":"ContainerStarted","Data":"799c8e491c1fb921400c3ecf4274b7b4b05bd9087d7cdb77d3d23e5c83eab62c"} Apr 22 18:21:11.478314 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.478276 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fkl7j" event={"ID":"9e3da756-d433-4424-94e8-89918c5586cd","Type":"ContainerStarted","Data":"90ded1473325b489e4c9ddbc8d07553608ba6a3e91a9a9fd062ff38e0ea1c7d5"} Apr 22 18:21:11.480590 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480573 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-acl-logging/0.log" Apr 22 18:21:11.480838 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480820 2551 generic.go:358] "Generic (PLEG): container finished" podID="9ee7d206-a285-4a35-a05b-5ea4d067ba50" containerID="5172c30f6778a70d7f8c03caed9baaca90a87e4d59028960b45e2d1da4baec67" exitCode=1 Apr 22 18:21:11.480898 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480879 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerDied","Data":"5172c30f6778a70d7f8c03caed9baaca90a87e4d59028960b45e2d1da4baec67"} Apr 22 18:21:11.480898 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480896 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"a196f3239acc32fb2a12637bd1e23984ee5a4b7a2c9cd2a1758c88978413f79d"} Apr 22 18:21:11.480976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480906 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"66958fc531ed47ac2a2fccf6d6ddee2defa5bbf9e40c5eb07c10e9deb0e765f9"} Apr 22 18:21:11.480976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480915 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"e1fade10342f8d3cf5abdc6153ea0e34e2cf931ef7fec6a7dbd3f92a8cd597ff"} Apr 22 18:21:11.480976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.480931 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"75952cde334a180e02f94e8ace66ce2b1cef0a88d88f0b47bfc58564e8fbe68f"} Apr 22 18:21:11.482020 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.482000 2551 generic.go:358] "Generic (PLEG): container finished" podID="c9321020-7781-4341-88ff-61119a578087" containerID="75a7bdbf3f5a15d6856ceefd01f165dc4ab18dd1c35f3de381c48b4181c6e350" exitCode=0 Apr 22 18:21:11.482150 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.482126 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerDied","Data":"75a7bdbf3f5a15d6856ceefd01f165dc4ab18dd1c35f3de381c48b4181c6e350"} Apr 22 18:21:11.524021 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.523982 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-49bjb" podStartSLOduration=4.425773562 podStartE2EDuration="21.523971488s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.851201341 +0000 UTC m=+3.158792637" lastFinishedPulling="2026-04-22 18:21:09.949399269 +0000 UTC m=+20.256990563" observedRunningTime="2026-04-22 18:21:10.650916347 +0000 UTC m=+20.958507665" watchObservedRunningTime="2026-04-22 18:21:11.523971488 +0000 UTC m=+21.831562803" Apr 22 18:21:11.567172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:11.566984 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fkl7j" podStartSLOduration=4.474380398 podStartE2EDuration="21.566973489s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.856890181 +0000 UTC m=+3.164481487" lastFinishedPulling="2026-04-22 18:21:09.949483279 +0000 UTC m=+20.257074578" observedRunningTime="2026-04-22 18:21:11.523811318 +0000 UTC m=+21.831402662" watchObservedRunningTime="2026-04-22 18:21:11.566973489 +0000 UTC m=+21.874564800" Apr 22 18:21:12.276138 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:12.276032 2551 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:21:11.362060934Z","UUID":"ed998bfd-7755-44b1-a6d9-5b7621f33ed8","Handler":null,"Name":"","Endpoint":""} Apr 22 18:21:12.278058 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:12.278032 2551 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI 
Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:21:12.278162 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:12.278069 2551 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:21:12.341565 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:12.341534 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:12.341696 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:12.341674 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d" Apr 22 18:21:12.341764 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:12.341709 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:12.341816 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:12.341780 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757" Apr 22 18:21:12.486376 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:12.486347 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" event={"ID":"1dc684e4-5e8a-4c53-8f22-806b580c14f9","Type":"ContainerStarted","Data":"57bfe81b5f462ebfb093f10aa3ae9bb456915e506c89d807b17addc8c72c3aca"} Apr 22 18:21:13.491653 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:13.491625 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-acl-logging/0.log" Apr 22 18:21:13.492170 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:13.491991 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"64f73bdff17fdada56ba7a0f9b5486ac87189f0f32605a536096d6982e7553a2"} Apr 22 18:21:13.520533 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:13.520476 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b9tln" podStartSLOduration=4.022329033 podStartE2EDuration="23.520459109s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.848011762 +0000 UTC m=+3.155603056" lastFinishedPulling="2026-04-22 18:21:12.34614184 +0000 UTC m=+22.653733132" observedRunningTime="2026-04-22 18:21:13.520052092 +0000 UTC m=+23.827643408" watchObservedRunningTime="2026-04-22 18:21:13.520459109 +0000 UTC m=+23.828050424" Apr 22 18:21:13.918263 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:13.918234 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:21:13.918889 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:13.918866 2551 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:21:14.342108 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:14.342079 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:14.342258 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:14.342128 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:14.342258 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:14.342210 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757" Apr 22 18:21:14.342386 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:14.342353 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d" Apr 22 18:21:14.493480 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:14.493450 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-49bjb_c818e9ac-a0e9-433e-9277-17075adf6903/dns-node-resolver/0.log" Apr 22 18:21:14.494075 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:14.493791 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:21:14.494240 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:14.494225 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z9whf" Apr 22 18:21:15.499413 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.499190 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-acl-logging/0.log" Apr 22 18:21:15.500000 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.499803 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"be70263b0724ce8e67f71c0fded7c789c2236c48147698b1f950d31479d39f20"} Apr 22 18:21:15.500280 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.500257 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:21:15.500378 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.500370 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:21:15.500423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.500402 2551 scope.go:117] "RemoveContainer" containerID="5172c30f6778a70d7f8c03caed9baaca90a87e4d59028960b45e2d1da4baec67" Apr 22 18:21:15.500467 ip-10-0-143-88 kubenswrapper[2551]: 
I0422 18:21:15.500454 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:21:15.517007 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.516984 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:21:15.517112 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.517047 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:21:15.616518 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:15.616481 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vs5cv_5ebef991-9163-4261-ab37-3aa25f981e43/node-ca/0.log" Apr 22 18:21:16.342107 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:16.342072 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:16.342264 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:16.342204 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757" Apr 22 18:21:16.342317 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:16.342285 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:16.342422 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:16.342399 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d" Apr 22 18:21:17.506454 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:17.506427 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-acl-logging/0.log" Apr 22 18:21:17.506953 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:17.506777 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" event={"ID":"9ee7d206-a285-4a35-a05b-5ea4d067ba50","Type":"ContainerStarted","Data":"bb24d55d8bc945e72ab5f8fac9f12acd940d5e5b298d93d2879614b7830b4c10"} Apr 22 18:21:17.508361 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:17.508336 2551 generic.go:358] "Generic (PLEG): container finished" podID="c9321020-7781-4341-88ff-61119a578087" containerID="05529d63ebfe5645cd88e168ebf314043d4c3c9020e3af02f072695a8b573d49" exitCode=0 Apr 22 18:21:17.508478 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:17.508381 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerDied","Data":"05529d63ebfe5645cd88e168ebf314043d4c3c9020e3af02f072695a8b573d49"} Apr 22 18:21:17.538988 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:17.538949 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" podStartSLOduration=10.207395719 
podStartE2EDuration="27.538936549s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.855382441 +0000 UTC m=+3.162973737" lastFinishedPulling="2026-04-22 18:21:10.186923273 +0000 UTC m=+20.494514567" observedRunningTime="2026-04-22 18:21:17.537848541 +0000 UTC m=+27.845439856" watchObservedRunningTime="2026-04-22 18:21:17.538936549 +0000 UTC m=+27.846527864" Apr 22 18:21:18.342038 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:18.341996 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:18.342193 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:18.342102 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d" Apr 22 18:21:18.342193 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:18.342165 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:18.342308 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:18.342272 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757" Apr 22 18:21:18.511531 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:18.511451 2551 generic.go:358] "Generic (PLEG): container finished" podID="c9321020-7781-4341-88ff-61119a578087" containerID="1ef1e8fd57df070be676de1f995c74964d69e134d7a93f376834de535d4095f1" exitCode=0 Apr 22 18:21:18.511937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:18.511541 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerDied","Data":"1ef1e8fd57df070be676de1f995c74964d69e134d7a93f376834de535d4095f1"} Apr 22 18:21:19.515225 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:19.515139 2551 generic.go:358] "Generic (PLEG): container finished" podID="c9321020-7781-4341-88ff-61119a578087" containerID="bbe4dffede196242e805ee8c5aa7a8d9451f457241339cf740bc33f0dda883d6" exitCode=0 Apr 22 18:21:19.515225 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:19.515197 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerDied","Data":"bbe4dffede196242e805ee8c5aa7a8d9451f457241339cf740bc33f0dda883d6"} Apr 22 18:21:20.343825 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:20.343188 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:20.343825 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:20.343314 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d" Apr 22 18:21:20.343825 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:20.343690 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:20.343825 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:20.343782 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757" Apr 22 18:21:22.342473 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:22.342277 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:22.343036 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:22.342329 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:22.343036 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:22.342583 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d" Apr 22 18:21:22.343036 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:22.342658 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757" Apr 22 18:21:23.952738 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:23.952699 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:23.953145 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:23.952862 2551 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:21:23.953145 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:23.952928 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs podName:f76d2d65-0522-4be1-946b-e6c684435f2d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:55.952913436 +0000 UTC m=+66.260504729 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs") pod "network-metrics-daemon-vvfq2" (UID: "f76d2d65-0522-4be1-946b-e6c684435f2d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:21:24.053720 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:24.053683 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:24.053905 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:24.053820 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:21:24.053905 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:24.053843 2551 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:21:24.053905 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:24.053857 2551 projected.go:194] Error preparing data for projected volume kube-api-access-gl9n6 for pod openshift-network-diagnostics/network-check-target-2h4lz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:21:24.054040 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:24.053922 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6 podName:851a7f2e-934f-4bc2-a805-c7e37bdc4757 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:56.053905723 +0000 UTC m=+66.361497018 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gl9n6" (UniqueName: "kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6") pod "network-check-target-2h4lz" (UID: "851a7f2e-934f-4bc2-a805-c7e37bdc4757") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:21:24.341864 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:24.341832 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:24.342038 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:24.341872 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:24.342038 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:24.341966 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:24.342133 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:24.342096 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:25.529847 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:25.529812 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerStarted","Data":"c35a05aae9bc4207858b45f29c184399bfaef9612e804cf9e48a764de7bea6df"}
Apr 22 18:21:26.341756 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:26.341720 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:26.341901 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:26.341842 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:26.341959 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:26.341899 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:26.342016 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:26.341998 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:26.533598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:26.533543 2551 generic.go:358] "Generic (PLEG): container finished" podID="c9321020-7781-4341-88ff-61119a578087" containerID="c35a05aae9bc4207858b45f29c184399bfaef9612e804cf9e48a764de7bea6df" exitCode=0
Apr 22 18:21:26.533598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:26.533586 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerDied","Data":"c35a05aae9bc4207858b45f29c184399bfaef9612e804cf9e48a764de7bea6df"}
Apr 22 18:21:27.537526 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:27.537485 2551 generic.go:358] "Generic (PLEG): container finished" podID="c9321020-7781-4341-88ff-61119a578087" containerID="ab0f8ae23b7a0d14d2e5555eebd3c1f5ebfb799474db52447e6aa174c30df3d6" exitCode=0
Apr 22 18:21:27.537912 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:27.537531 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerDied","Data":"ab0f8ae23b7a0d14d2e5555eebd3c1f5ebfb799474db52447e6aa174c30df3d6"}
Apr 22 18:21:28.341994 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:28.341957 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:28.342164 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:28.342064 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:28.342164 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:28.342122 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:28.342241 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:28.342211 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:28.542355 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:28.542319 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" event={"ID":"c9321020-7781-4341-88ff-61119a578087","Type":"ContainerStarted","Data":"b0b0a6e6cf7016bda226ff976b143ad955ed25e55377d72575680cb17da73c21"}
Apr 22 18:21:28.564010 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:28.563965 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rz6nj" podStartSLOduration=6.119383633 podStartE2EDuration="38.563953059s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:20:52.848326463 +0000 UTC m=+3.155917756" lastFinishedPulling="2026-04-22 18:21:25.29289589 +0000 UTC m=+35.600487182" observedRunningTime="2026-04-22 18:21:28.562791387 +0000 UTC m=+38.870382681" watchObservedRunningTime="2026-04-22 18:21:28.563953059 +0000 UTC m=+38.871544374"
Apr 22 18:21:30.342123 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:30.342086 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:30.342703 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:30.342172 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:30.342703 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:30.342097 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:30.342703 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:30.342236 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:32.341353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:32.341322 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:32.341832 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:32.341325 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:32.341832 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:32.341419 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:32.341832 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:32.341508 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:34.342353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:34.342320 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:34.342949 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:34.342328 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:34.343151 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:34.343119 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:34.343249 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:34.342960 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:35.296943 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:35.296703 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2h4lz"]
Apr 22 18:21:35.297185 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:35.297166 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:35.297304 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:35.297274 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:35.297434 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:35.297398 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vvfq2"]
Apr 22 18:21:35.297497 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:35.297485 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:35.297636 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:35.297612 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:37.341389 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:37.341353 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:37.342129 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:37.341366 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:37.342129 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:37.341477 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:37.342129 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:37.341527 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:39.341983 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:39.341950 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:39.342524 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:39.341959 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:39.342524 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:39.342043 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h4lz" podUID="851a7f2e-934f-4bc2-a805-c7e37bdc4757"
Apr 22 18:21:39.342524 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:39.342182 2551 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vvfq2" podUID="f76d2d65-0522-4be1-946b-e6c684435f2d"
Apr 22 18:21:40.032617 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.032583 2551 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-88.ec2.internal" event="NodeReady"
Apr 22 18:21:40.032775 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.032725 2551 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:21:40.078445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.078366 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t4nt7"]
Apr 22 18:21:40.115829 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.115806 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8gbsn"]
Apr 22 18:21:40.115979 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.115962 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.119010 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.118984 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-69x95\""
Apr 22 18:21:40.119010 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.118998 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:21:40.119203 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.119122 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:21:40.133937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.133915 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t4nt7"]
Apr 22 18:21:40.133937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.133936 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8gbsn"]
Apr 22 18:21:40.134066 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.133946 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-brzh2"]
Apr 22 18:21:40.134066 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.134053 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.138360 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.138292 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:21:40.138498 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.138303 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:21:40.141384 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.140097 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:21:40.141384 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.140229 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-592td\""
Apr 22 18:21:40.141384 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.140970 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:21:40.158260 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.158237 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-brzh2"]
Apr 22 18:21:40.158353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.158322 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.160655 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.160637 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:21:40.160655 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.160652 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwl6t\""
Apr 22 18:21:40.160917 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.160900 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:21:40.161348 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.161335 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:21:40.167134 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167114 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9cce88c0-9408-413b-9cd7-e5fc219d25a9-metrics-tls\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.167219 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167149 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cdb9284-ca5f-40ac-ab03-4001937629de-cert\") pod \"ingress-canary-brzh2\" (UID: \"7cdb9284-ca5f-40ac-ab03-4001937629de\") " pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.167219 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167183 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/23453ed9-b368-42e8-85da-2827f42c37c5-crio-socket\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.167320 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167222 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cce88c0-9408-413b-9cd7-e5fc219d25a9-tmp-dir\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.167320 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167263 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8s6r\" (UniqueName: \"kubernetes.io/projected/9cce88c0-9408-413b-9cd7-e5fc219d25a9-kube-api-access-x8s6r\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.167320 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167295 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/23453ed9-b368-42e8-85da-2827f42c37c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.167423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167324 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/23453ed9-b368-42e8-85da-2827f42c37c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.167423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167351 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cce88c0-9408-413b-9cd7-e5fc219d25a9-config-volume\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.167423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167388 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/23453ed9-b368-42e8-85da-2827f42c37c5-data-volume\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.167423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167408 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5j5\" (UniqueName: \"kubernetes.io/projected/23453ed9-b368-42e8-85da-2827f42c37c5-kube-api-access-zc5j5\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.167571 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.167427 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkps6\" (UniqueName: \"kubernetes.io/projected/7cdb9284-ca5f-40ac-ab03-4001937629de-kube-api-access-dkps6\") pod \"ingress-canary-brzh2\" (UID: \"7cdb9284-ca5f-40ac-ab03-4001937629de\") " pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.267822 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267788 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/23453ed9-b368-42e8-85da-2827f42c37c5-data-volume\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.267951 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267830 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5j5\" (UniqueName: \"kubernetes.io/projected/23453ed9-b368-42e8-85da-2827f42c37c5-kube-api-access-zc5j5\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.267951 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267857 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkps6\" (UniqueName: \"kubernetes.io/projected/7cdb9284-ca5f-40ac-ab03-4001937629de-kube-api-access-dkps6\") pod \"ingress-canary-brzh2\" (UID: \"7cdb9284-ca5f-40ac-ab03-4001937629de\") " pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.267951 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267884 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9cce88c0-9408-413b-9cd7-e5fc219d25a9-metrics-tls\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.267951 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267918 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cdb9284-ca5f-40ac-ab03-4001937629de-cert\") pod \"ingress-canary-brzh2\" (UID: \"7cdb9284-ca5f-40ac-ab03-4001937629de\") " pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.268180 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267952 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/23453ed9-b368-42e8-85da-2827f42c37c5-crio-socket\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.268180 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.267991 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cce88c0-9408-413b-9cd7-e5fc219d25a9-tmp-dir\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.268180 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268016 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8s6r\" (UniqueName: \"kubernetes.io/projected/9cce88c0-9408-413b-9cd7-e5fc219d25a9-kube-api-access-x8s6r\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.268180 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268145 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/23453ed9-b368-42e8-85da-2827f42c37c5-data-volume\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.268377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268192 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/23453ed9-b368-42e8-85da-2827f42c37c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.268377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268230 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/23453ed9-b368-42e8-85da-2827f42c37c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.268377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268255 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cce88c0-9408-413b-9cd7-e5fc219d25a9-config-volume\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.268377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268258 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/23453ed9-b368-42e8-85da-2827f42c37c5-crio-socket\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.268377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268360 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cce88c0-9408-413b-9cd7-e5fc219d25a9-tmp-dir\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.268786 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268767 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/23453ed9-b368-42e8-85da-2827f42c37c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.268786 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.268780 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cce88c0-9408-413b-9cd7-e5fc219d25a9-config-volume\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.271989 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.271971 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/23453ed9-b368-42e8-85da-2827f42c37c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.272081 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.272009 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9cce88c0-9408-413b-9cd7-e5fc219d25a9-metrics-tls\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.272081 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.272049 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cdb9284-ca5f-40ac-ab03-4001937629de-cert\") pod \"ingress-canary-brzh2\" (UID: \"7cdb9284-ca5f-40ac-ab03-4001937629de\") " pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.295640 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.295612 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5j5\" (UniqueName: \"kubernetes.io/projected/23453ed9-b368-42e8-85da-2827f42c37c5-kube-api-access-zc5j5\") pod \"insights-runtime-extractor-8gbsn\" (UID: \"23453ed9-b368-42e8-85da-2827f42c37c5\") " pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.295983 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.295966 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkps6\" (UniqueName: \"kubernetes.io/projected/7cdb9284-ca5f-40ac-ab03-4001937629de-kube-api-access-dkps6\") pod \"ingress-canary-brzh2\" (UID: \"7cdb9284-ca5f-40ac-ab03-4001937629de\") " pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.306229 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.306209 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8s6r\" (UniqueName: \"kubernetes.io/projected/9cce88c0-9408-413b-9cd7-e5fc219d25a9-kube-api-access-x8s6r\") pod \"dns-default-t4nt7\" (UID: \"9cce88c0-9408-413b-9cd7-e5fc219d25a9\") " pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.426747 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.426723 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:40.444749 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.444727 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8gbsn"
Apr 22 18:21:40.466387 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.466354 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-brzh2"
Apr 22 18:21:40.601306 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.601277 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-brzh2"]
Apr 22 18:21:40.604943 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.604921 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t4nt7"]
Apr 22 18:21:40.606572 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:40.606527 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cdb9284_ca5f_40ac_ab03_4001937629de.slice/crio-62b0a47cd341cf0c928a748396e218ac3a76524b98677bd3885b5661b30f96d6 WatchSource:0}: Error finding container 62b0a47cd341cf0c928a748396e218ac3a76524b98677bd3885b5661b30f96d6: Status 404 returned error can't find the container with id 62b0a47cd341cf0c928a748396e218ac3a76524b98677bd3885b5661b30f96d6
Apr 22 18:21:40.607299 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:40.607280 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8gbsn"]
Apr 22 18:21:40.608576 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:40.608476 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cce88c0_9408_413b_9cd7_e5fc219d25a9.slice/crio-ab4c8a8df450f35968260865598dd3a1426defcf9635aea7058d94cf1cce665a WatchSource:0}: Error finding container ab4c8a8df450f35968260865598dd3a1426defcf9635aea7058d94cf1cce665a: Status 404 returned error can't find the container with id ab4c8a8df450f35968260865598dd3a1426defcf9635aea7058d94cf1cce665a
Apr 22 18:21:40.612257 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:40.612137 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23453ed9_b368_42e8_85da_2827f42c37c5.slice/crio-cd36f943ff74a85f44a83375673228e0cf2f6be8fe152918c0a11f891b5989a2 WatchSource:0}: Error finding container cd36f943ff74a85f44a83375673228e0cf2f6be8fe152918c0a11f891b5989a2: Status 404 returned error can't find the container with id cd36f943ff74a85f44a83375673228e0cf2f6be8fe152918c0a11f891b5989a2
Apr 22 18:21:41.342327 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.342149 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:21:41.342629 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.342174 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2"
Apr 22 18:21:41.345744 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.345709 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zkrzn\""
Apr 22 18:21:41.345744 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.345736 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:21:41.345744 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.345753 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:21:41.345948 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.345848 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bn55n\""
Apr 22 18:21:41.345948 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.345877 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:21:41.567968 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.567929 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4nt7" event={"ID":"9cce88c0-9408-413b-9cd7-e5fc219d25a9","Type":"ContainerStarted","Data":"ab4c8a8df450f35968260865598dd3a1426defcf9635aea7058d94cf1cce665a"}
Apr 22 18:21:41.569083 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.569044 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-brzh2" event={"ID":"7cdb9284-ca5f-40ac-ab03-4001937629de","Type":"ContainerStarted","Data":"62b0a47cd341cf0c928a748396e218ac3a76524b98677bd3885b5661b30f96d6"}
Apr 22 18:21:41.570359 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.570334 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gbsn" event={"ID":"23453ed9-b368-42e8-85da-2827f42c37c5","Type":"ContainerStarted","Data":"d0d12be8c5898d9da22856fb46acb821d95d2988e6267e8fd48e4e69fe7f8738"}
Apr 22 18:21:41.570451 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:41.570367 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gbsn" event={"ID":"23453ed9-b368-42e8-85da-2827f42c37c5","Type":"ContainerStarted","Data":"cd36f943ff74a85f44a83375673228e0cf2f6be8fe152918c0a11f891b5989a2"}
Apr 22 18:21:43.578825 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.578782 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-brzh2" event={"ID":"7cdb9284-ca5f-40ac-ab03-4001937629de","Type":"ContainerStarted","Data":"ff581124efc5c6040b98288bbeb839d35f86321827e6ae1bf3d938a2bac2cfaf"}
Apr 22 18:21:43.580773 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.580738 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gbsn" event={"ID":"23453ed9-b368-42e8-85da-2827f42c37c5","Type":"ContainerStarted","Data":"00deef5914742f9e4073ab9901d7d7c8020edc51f5ca84f87f633bf1cbaf54cc"}
Apr 22
18:21:43.582345 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.582324 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4nt7" event={"ID":"9cce88c0-9408-413b-9cd7-e5fc219d25a9","Type":"ContainerStarted","Data":"5fa8cb43089cc6fa599232815a917c785d7eb3166b14e34c70b83826625d27b4"} Apr 22 18:21:43.582345 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.582348 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4nt7" event={"ID":"9cce88c0-9408-413b-9cd7-e5fc219d25a9","Type":"ContainerStarted","Data":"a33d807769071f3a43b84150bab2be2b1ffe516d61e712bacfe9f771c0f75dfe"} Apr 22 18:21:43.582505 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.582435 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t4nt7" Apr 22 18:21:43.599421 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.599374 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-brzh2" podStartSLOduration=1.306897182 podStartE2EDuration="3.599358039s" podCreationTimestamp="2026-04-22 18:21:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:40.610875962 +0000 UTC m=+50.918467257" lastFinishedPulling="2026-04-22 18:21:42.903336805 +0000 UTC m=+53.210928114" observedRunningTime="2026-04-22 18:21:43.598320079 +0000 UTC m=+53.905911393" watchObservedRunningTime="2026-04-22 18:21:43.599358039 +0000 UTC m=+53.906949356" Apr 22 18:21:43.615719 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.615140 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t4nt7" podStartSLOduration=1.327207572 podStartE2EDuration="3.615127052s" podCreationTimestamp="2026-04-22 18:21:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:40.61190366 +0000 UTC m=+50.919494957" lastFinishedPulling="2026-04-22 18:21:42.899823143 +0000 UTC m=+53.207414437" observedRunningTime="2026-04-22 
18:21:43.614350547 +0000 UTC m=+53.921941877" watchObservedRunningTime="2026-04-22 18:21:43.615127052 +0000 UTC m=+53.922718368" Apr 22 18:21:43.625791 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.625766 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b"] Apr 22 18:21:43.637860 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.637838 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:43.639447 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.639427 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b"] Apr 22 18:21:43.641578 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.640316 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:21:43.641578 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.640611 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-pggqk\"" Apr 22 18:21:43.690247 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.690222 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e62322ef-140c-4f4a-b425-1efdef4c09e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9qx2b\" (UID: \"e62322ef-140c-4f4a-b425-1efdef4c09e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:43.790865 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.790836 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/e62322ef-140c-4f4a-b425-1efdef4c09e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9qx2b\" (UID: \"e62322ef-140c-4f4a-b425-1efdef4c09e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:43.795084 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.794939 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e62322ef-140c-4f4a-b425-1efdef4c09e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9qx2b\" (UID: \"e62322ef-140c-4f4a-b425-1efdef4c09e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:43.885062 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.885031 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-84r4n"] Apr 22 18:21:43.897384 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.897359 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-84r4n"] Apr 22 18:21:43.897496 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.897464 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:21:43.899702 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.899669 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:21:43.899702 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.899684 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:21:43.899859 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.899771 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-fv6c7\"" Apr 22 18:21:43.949145 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.949112 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:43.992388 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:43.992359 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbz44\" (UniqueName: \"kubernetes.io/projected/95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74-kube-api-access-mbz44\") pod \"downloads-6bcc868b7-84r4n\" (UID: \"95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74\") " pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:21:44.092983 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:44.092948 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbz44\" (UniqueName: \"kubernetes.io/projected/95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74-kube-api-access-mbz44\") pod \"downloads-6bcc868b7-84r4n\" (UID: \"95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74\") " pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:21:44.103606 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:44.103578 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbz44\" 
(UniqueName: \"kubernetes.io/projected/95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74-kube-api-access-mbz44\") pod \"downloads-6bcc868b7-84r4n\" (UID: \"95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74\") " pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:21:44.207893 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:44.207768 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:21:44.560093 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:44.560060 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b"] Apr 22 18:21:44.564848 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:44.564820 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62322ef_140c_4f4a_b425_1efdef4c09e9.slice/crio-f61a2abbefc82e2ce35d03d46e94708a6820f80ccfd8a1767eb2b5018af5f61d WatchSource:0}: Error finding container f61a2abbefc82e2ce35d03d46e94708a6820f80ccfd8a1767eb2b5018af5f61d: Status 404 returned error can't find the container with id f61a2abbefc82e2ce35d03d46e94708a6820f80ccfd8a1767eb2b5018af5f61d Apr 22 18:21:44.573272 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:44.573247 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-84r4n"] Apr 22 18:21:44.586744 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:44.586711 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" event={"ID":"e62322ef-140c-4f4a-b425-1efdef4c09e9","Type":"ContainerStarted","Data":"f61a2abbefc82e2ce35d03d46e94708a6820f80ccfd8a1767eb2b5018af5f61d"} Apr 22 18:21:44.642611 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:44.642573 2551 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e39a23_bd8f_4cdd_8bf8_f2edb8f1fb74.slice/crio-4187042a1a00fbbe5668059daf5e362698e99733c250e37c8ccfebc5b135b32d WatchSource:0}: Error finding container 4187042a1a00fbbe5668059daf5e362698e99733c250e37c8ccfebc5b135b32d: Status 404 returned error can't find the container with id 4187042a1a00fbbe5668059daf5e362698e99733c250e37c8ccfebc5b135b32d Apr 22 18:21:45.591627 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:45.591586 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-84r4n" event={"ID":"95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74","Type":"ContainerStarted","Data":"4187042a1a00fbbe5668059daf5e362698e99733c250e37c8ccfebc5b135b32d"} Apr 22 18:21:45.593922 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:45.593894 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gbsn" event={"ID":"23453ed9-b368-42e8-85da-2827f42c37c5","Type":"ContainerStarted","Data":"c9c0005c90b506267f1e02449078a1af5e620e3fa8a5dc0bbc2d5049e1d9ed01"} Apr 22 18:21:45.612541 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:45.612469 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8gbsn" podStartSLOduration=1.682690484 podStartE2EDuration="5.612457169s" podCreationTimestamp="2026-04-22 18:21:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:40.722842883 +0000 UTC m=+51.030434181" lastFinishedPulling="2026-04-22 18:21:44.65260957 +0000 UTC m=+54.960200866" observedRunningTime="2026-04-22 18:21:45.611017647 +0000 UTC m=+55.918608962" watchObservedRunningTime="2026-04-22 18:21:45.612457169 +0000 UTC m=+55.920048484" Apr 22 18:21:46.597889 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:46.597851 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" 
event={"ID":"e62322ef-140c-4f4a-b425-1efdef4c09e9","Type":"ContainerStarted","Data":"5a6a96615d84783e88e6d3c5a9141a5b78880a76f7dedcaf5dc5a5d4ce6ae268"} Apr 22 18:21:46.598309 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:46.598185 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:46.602805 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:46.602781 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" Apr 22 18:21:46.612818 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:46.612708 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9qx2b" podStartSLOduration=1.92635131 podStartE2EDuration="3.612693854s" podCreationTimestamp="2026-04-22 18:21:43 +0000 UTC" firstStartedPulling="2026-04-22 18:21:44.5667509 +0000 UTC m=+54.874342193" lastFinishedPulling="2026-04-22 18:21:46.253093441 +0000 UTC m=+56.560684737" observedRunningTime="2026-04-22 18:21:46.612218009 +0000 UTC m=+56.919809348" watchObservedRunningTime="2026-04-22 18:21:46.612693854 +0000 UTC m=+56.920285167" Apr 22 18:21:47.689243 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.689209 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-24bwx"] Apr 22 18:21:47.713930 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.713901 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-24bwx"] Apr 22 18:21:47.714092 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.714025 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.716627 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.716431 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qp2rx\"" Apr 22 18:21:47.716627 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.716490 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:21:47.716627 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.716501 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 18:21:47.716627 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.716487 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:21:47.717036 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.717015 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 18:21:47.717159 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.717140 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:21:47.721541 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.721520 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.721662 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.721597 2551 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b939c03-6181-4d6d-967f-165ff0a8e2e6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.721662 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.721641 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qkf\" (UniqueName: \"kubernetes.io/projected/4b939c03-6181-4d6d-967f-165ff0a8e2e6-kube-api-access-p5qkf\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.721751 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.721676 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.822419 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.822335 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b939c03-6181-4d6d-967f-165ff0a8e2e6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.822576 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.822425 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qkf\" (UniqueName: 
\"kubernetes.io/projected/4b939c03-6181-4d6d-967f-165ff0a8e2e6-kube-api-access-p5qkf\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.822576 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.822468 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.822576 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.822515 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.822865 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:47.822844 2551 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 18:21:47.822964 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:47.822915 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-tls podName:4b939c03-6181-4d6d-967f-165ff0a8e2e6 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:48.322893981 +0000 UTC m=+58.630485275 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-24bwx" (UID: "4b939c03-6181-4d6d-967f-165ff0a8e2e6") : secret "prometheus-operator-tls" not found Apr 22 18:21:47.823840 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.823816 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b939c03-6181-4d6d-967f-165ff0a8e2e6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.825845 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.825824 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:47.832000 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:47.831974 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qkf\" (UniqueName: \"kubernetes.io/projected/4b939c03-6181-4d6d-967f-165ff0a8e2e6-kube-api-access-p5qkf\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:48.325968 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:48.325930 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: 
\"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:48.328977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:48.328947 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b939c03-6181-4d6d-967f-165ff0a8e2e6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-24bwx\" (UID: \"4b939c03-6181-4d6d-967f-165ff0a8e2e6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:48.524742 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:48.524710 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz6xk" Apr 22 18:21:48.625354 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:48.625328 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" Apr 22 18:21:48.747315 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:48.747285 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-24bwx"] Apr 22 18:21:48.750578 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:48.750538 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b939c03_6181_4d6d_967f_165ff0a8e2e6.slice/crio-fc88dc9000343e04b96d4dae3e9e718e8b7085cfe844863f3c69cc17fc1ea0ad WatchSource:0}: Error finding container fc88dc9000343e04b96d4dae3e9e718e8b7085cfe844863f3c69cc17fc1ea0ad: Status 404 returned error can't find the container with id fc88dc9000343e04b96d4dae3e9e718e8b7085cfe844863f3c69cc17fc1ea0ad Apr 22 18:21:49.607638 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:49.607594 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" 
event={"ID":"4b939c03-6181-4d6d-967f-165ff0a8e2e6","Type":"ContainerStarted","Data":"fc88dc9000343e04b96d4dae3e9e718e8b7085cfe844863f3c69cc17fc1ea0ad"} Apr 22 18:21:50.612080 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:50.612041 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" event={"ID":"4b939c03-6181-4d6d-967f-165ff0a8e2e6","Type":"ContainerStarted","Data":"5cc1e5c047b1b671de8eb53b766cb123fe392d2927e3e1305612d2e05cd0eb62"} Apr 22 18:21:50.612080 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:50.612081 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" event={"ID":"4b939c03-6181-4d6d-967f-165ff0a8e2e6","Type":"ContainerStarted","Data":"d2ba69c996e629784372894c16e09be8f9416b97dad31ab91dd71e8281648315"} Apr 22 18:21:50.630388 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:50.630343 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-24bwx" podStartSLOduration=2.314239694 podStartE2EDuration="3.630329947s" podCreationTimestamp="2026-04-22 18:21:47 +0000 UTC" firstStartedPulling="2026-04-22 18:21:48.752618719 +0000 UTC m=+59.060210012" lastFinishedPulling="2026-04-22 18:21:50.068708959 +0000 UTC m=+60.376300265" observedRunningTime="2026-04-22 18:21:50.629013186 +0000 UTC m=+60.936604501" watchObservedRunningTime="2026-04-22 18:21:50.630329947 +0000 UTC m=+60.937921291" Apr 22 18:21:52.056609 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.056580 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"] Apr 22 18:21:52.059775 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.059760 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg" Apr 22 18:21:52.061960 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.061943 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:21:52.062065 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.061968 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:21:52.062184 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.062170 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-hwgh9\"" Apr 22 18:21:52.071942 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.071893 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"] Apr 22 18:21:52.106325 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.106286 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6n8s9"] Apr 22 18:21:52.109273 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.109257 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.111498 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.111477 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:21:52.111498 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.111489 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:21:52.111658 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.111480 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ghp22\""
Apr 22 18:21:52.111658 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.111588 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:21:52.118502 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.118483 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"]
Apr 22 18:21:52.121341 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.121328 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.123456 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.123439 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:21:52.123456 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.123448 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-c4gq8\""
Apr 22 18:21:52.123824 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.123809 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 18:21:52.124044 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.124027 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 18:21:52.135592 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.135569 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"]
Apr 22 18:21:52.159034 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159009 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/515611f7-a360-418d-b451-b9a1ae66f011-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.159146 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159037 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/247e6f19-cb23-47a1-9176-374e21a56b59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.159146 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159064 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-tls\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159146 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159106 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfshm\" (UniqueName: \"kubernetes.io/projected/d76607bd-cdff-4883-9d9f-5cea32d17b2c-kube-api-access-qfshm\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159146 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159133 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l96s\" (UniqueName: \"kubernetes.io/projected/515611f7-a360-418d-b451-b9a1ae66f011-kube-api-access-6l96s\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.159353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159168 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d76607bd-cdff-4883-9d9f-5cea32d17b2c-metrics-client-ca\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159211 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.159353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159269 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159307 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d225t\" (UniqueName: \"kubernetes.io/projected/247e6f19-cb23-47a1-9176-374e21a56b59-kube-api-access-d225t\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.159353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159330 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-wtmp\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159367 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159426 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/247e6f19-cb23-47a1-9176-374e21a56b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159444 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-sys\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159460 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/515611f7-a360-418d-b451-b9a1ae66f011-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159477 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159504 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-accelerators-collector-config\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159571 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/247e6f19-cb23-47a1-9176-374e21a56b59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.159630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159627 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-root\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.160018 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.159675 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-textfile\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260108 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260079 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-accelerators-collector-config\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260278 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260115 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/247e6f19-cb23-47a1-9176-374e21a56b59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.260278 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260133 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-root\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260278 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260192 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-root\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260278 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260229 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-textfile\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260278 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260267 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/515611f7-a360-418d-b451-b9a1ae66f011-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.260543 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260293 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/247e6f19-cb23-47a1-9176-374e21a56b59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.260543 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260321 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-tls\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260543 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260346 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfshm\" (UniqueName: \"kubernetes.io/projected/d76607bd-cdff-4883-9d9f-5cea32d17b2c-kube-api-access-qfshm\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260712 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:52.260648 2551 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:21:52.260712 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260674 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l96s\" (UniqueName: \"kubernetes.io/projected/515611f7-a360-418d-b451-b9a1ae66f011-kube-api-access-6l96s\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.260807 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:21:52.260721 2551 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-tls podName:d76607bd-cdff-4883-9d9f-5cea32d17b2c nodeName:}" failed. No retries permitted until 2026-04-22 18:21:52.760701259 +0000 UTC m=+63.068292555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-tls") pod "node-exporter-6n8s9" (UID: "d76607bd-cdff-4883-9d9f-5cea32d17b2c") : secret "node-exporter-tls" not found
Apr 22 18:21:52.260807 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260760 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d76607bd-cdff-4883-9d9f-5cea32d17b2c-metrics-client-ca\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260807 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260771 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-accelerators-collector-config\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.260807 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260791 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260830 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260891 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d225t\" (UniqueName: \"kubernetes.io/projected/247e6f19-cb23-47a1-9176-374e21a56b59-kube-api-access-d225t\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260917 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-wtmp\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260961 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260994 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/247e6f19-cb23-47a1-9176-374e21a56b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.260999 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/515611f7-a360-418d-b451-b9a1ae66f011-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261037 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-sys\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261059 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/247e6f19-cb23-47a1-9176-374e21a56b59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261062 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/515611f7-a360-418d-b451-b9a1ae66f011-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261183 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261112 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261697 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261196 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-wtmp\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.261697 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261382 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/515611f7-a360-418d-b451-b9a1ae66f011-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261697 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261441 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d76607bd-cdff-4883-9d9f-5cea32d17b2c-sys\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.261697 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261529 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.261697 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261685 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d76607bd-cdff-4883-9d9f-5cea32d17b2c-metrics-client-ca\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.261933 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.261761 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-textfile\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.263659 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.263634 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/247e6f19-cb23-47a1-9176-374e21a56b59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.263962 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.263927 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.264138 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.264114 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/247e6f19-cb23-47a1-9176-374e21a56b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.264216 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.264203 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/515611f7-a360-418d-b451-b9a1ae66f011-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.264301 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.264284 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.272543 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.272516 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l96s\" (UniqueName: \"kubernetes.io/projected/515611f7-a360-418d-b451-b9a1ae66f011-kube-api-access-6l96s\") pod \"kube-state-metrics-69db897b98-fxx9s\" (UID: \"515611f7-a360-418d-b451-b9a1ae66f011\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.272652 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.272586 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfshm\" (UniqueName: \"kubernetes.io/projected/d76607bd-cdff-4883-9d9f-5cea32d17b2c-kube-api-access-qfshm\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.273642 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.273625 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d225t\" (UniqueName: \"kubernetes.io/projected/247e6f19-cb23-47a1-9176-374e21a56b59-kube-api-access-d225t\") pod \"openshift-state-metrics-9d44df66c-q5kbg\" (UID: \"247e6f19-cb23-47a1-9176-374e21a56b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.368108 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.368075 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"
Apr 22 18:21:52.429370 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.428920 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"
Apr 22 18:21:52.511950 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.511918 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg"]
Apr 22 18:21:52.521760 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:52.516255 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod247e6f19_cb23_47a1_9176_374e21a56b59.slice/crio-e12e487cb220a332a4411c28ff20f8daa7be5028cfe849496db684b33ccb0f62 WatchSource:0}: Error finding container e12e487cb220a332a4411c28ff20f8daa7be5028cfe849496db684b33ccb0f62: Status 404 returned error can't find the container with id e12e487cb220a332a4411c28ff20f8daa7be5028cfe849496db684b33ccb0f62
Apr 22 18:21:52.567162 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.567131 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-fxx9s"]
Apr 22 18:21:52.570508 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:52.570484 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod515611f7_a360_418d_b451_b9a1ae66f011.slice/crio-3edf1670aa349102876de6c04ba5ebc6f8f5c7e5f53a8e36e0c5cea599889e58 WatchSource:0}: Error finding container 3edf1670aa349102876de6c04ba5ebc6f8f5c7e5f53a8e36e0c5cea599889e58: Status 404 returned error can't find the container with id 3edf1670aa349102876de6c04ba5ebc6f8f5c7e5f53a8e36e0c5cea599889e58
Apr 22 18:21:52.619515 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.619314 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s" event={"ID":"515611f7-a360-418d-b451-b9a1ae66f011","Type":"ContainerStarted","Data":"3edf1670aa349102876de6c04ba5ebc6f8f5c7e5f53a8e36e0c5cea599889e58"}
Apr 22 18:21:52.621616 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.621529 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg" event={"ID":"247e6f19-cb23-47a1-9176-374e21a56b59","Type":"ContainerStarted","Data":"0f36510b9a3e622223bd2b2f40dc05f29007f8fbb9cecce2e1dd37ddab2f8ed5"}
Apr 22 18:21:52.621616 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.621585 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg" event={"ID":"247e6f19-cb23-47a1-9176-374e21a56b59","Type":"ContainerStarted","Data":"e12e487cb220a332a4411c28ff20f8daa7be5028cfe849496db684b33ccb0f62"}
Apr 22 18:21:52.765826 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.765786 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-tls\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:52.768360 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:52.768336 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d76607bd-cdff-4883-9d9f-5cea32d17b2c-node-exporter-tls\") pod \"node-exporter-6n8s9\" (UID: \"d76607bd-cdff-4883-9d9f-5cea32d17b2c\") " pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:53.017846 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:53.017774 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6n8s9"
Apr 22 18:21:53.026468 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:53.026434 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd76607bd_cdff_4883_9d9f_5cea32d17b2c.slice/crio-b49bd2d6266af3c95c711ae1bc9239e78eb11d7a69f497aa63d48c3b0bb7c118 WatchSource:0}: Error finding container b49bd2d6266af3c95c711ae1bc9239e78eb11d7a69f497aa63d48c3b0bb7c118: Status 404 returned error can't find the container with id b49bd2d6266af3c95c711ae1bc9239e78eb11d7a69f497aa63d48c3b0bb7c118
Apr 22 18:21:53.590137 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:53.590111 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t4nt7"
Apr 22 18:21:53.631420 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:53.631365 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg" event={"ID":"247e6f19-cb23-47a1-9176-374e21a56b59","Type":"ContainerStarted","Data":"05ae76417d528f7f63a0d6393550959be99d60d7cabdc86cab4532ba8e5804f6"}
Apr 22 18:21:53.633018 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:53.632973 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8s9" event={"ID":"d76607bd-cdff-4883-9d9f-5cea32d17b2c","Type":"ContainerStarted","Data":"b49bd2d6266af3c95c711ae1bc9239e78eb11d7a69f497aa63d48c3b0bb7c118"}
Apr 22 18:21:54.011202 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.011126 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79bddf957-85jtq"]
Apr 22 18:21:54.032214 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.032185 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79bddf957-85jtq"]
Apr 22 18:21:54.032389 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.032326 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.034603 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.034578 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 18:21:54.035194 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.034965 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 18:21:54.035194 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.034977 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 18:21:54.035194 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.035038 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jj6h5\""
Apr 22 18:21:54.035194 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.034975 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 18:21:54.035194 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.034975 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 18:21:54.076906 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.076878 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-service-ca\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.077056 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.076921 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zx5d\" (UniqueName: \"kubernetes.io/projected/c52fe620-272c-4caf-8c9b-e7481e236fdd-kube-api-access-7zx5d\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.077056 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.076995 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-config\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.077165 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.077081 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-serving-cert\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.077165 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.077118 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-oauth-serving-cert\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.077266 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.077174 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-oauth-config\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:21:54.178188 ip-10-0-143-88 kubenswrapper[2551]:
I0422 18:21:54.178155 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-oauth-config\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.178371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.178210 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-service-ca\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.178371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.178243 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zx5d\" (UniqueName: \"kubernetes.io/projected/c52fe620-272c-4caf-8c9b-e7481e236fdd-kube-api-access-7zx5d\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.178371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.178273 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-config\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.178371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.178328 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-serving-cert\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 
18:21:54.178371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.178345 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-oauth-serving-cert\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.179054 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.179031 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-config\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.179152 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.179031 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-service-ca\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.179328 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.179306 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-oauth-serving-cert\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.180916 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.180889 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-oauth-config\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " 
pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.181187 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.181171 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-serving-cert\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.190023 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.190002 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zx5d\" (UniqueName: \"kubernetes.io/projected/c52fe620-272c-4caf-8c9b-e7481e236fdd-kube-api-access-7zx5d\") pod \"console-79bddf957-85jtq\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") " pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:54.343482 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:54.343438 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79bddf957-85jtq" Apr 22 18:21:55.344172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.344143 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79bddf957-85jtq"] Apr 22 18:21:55.361281 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:55.359745 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc52fe620_272c_4caf_8c9b_e7481e236fdd.slice/crio-9627103ccdc659c5f4f419c609f60bac424398926915a7655c636cc6d53b3cef WatchSource:0}: Error finding container 9627103ccdc659c5f4f419c609f60bac424398926915a7655c636cc6d53b3cef: Status 404 returned error can't find the container with id 9627103ccdc659c5f4f419c609f60bac424398926915a7655c636cc6d53b3cef Apr 22 18:21:55.642172 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.642135 2551 generic.go:358] "Generic (PLEG): container finished" podID="d76607bd-cdff-4883-9d9f-5cea32d17b2c" 
containerID="fa18e08cfff2be27de39d14d1b6b16c715aba6426aa4cd07e1df2ab534cd54df" exitCode=0 Apr 22 18:21:55.642329 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.642176 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8s9" event={"ID":"d76607bd-cdff-4883-9d9f-5cea32d17b2c","Type":"ContainerDied","Data":"fa18e08cfff2be27de39d14d1b6b16c715aba6426aa4cd07e1df2ab534cd54df"} Apr 22 18:21:55.643773 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.643748 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bddf957-85jtq" event={"ID":"c52fe620-272c-4caf-8c9b-e7481e236fdd","Type":"ContainerStarted","Data":"9627103ccdc659c5f4f419c609f60bac424398926915a7655c636cc6d53b3cef"} Apr 22 18:21:55.649884 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.649050 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s" event={"ID":"515611f7-a360-418d-b451-b9a1ae66f011","Type":"ContainerStarted","Data":"75a0038b05c9fd514b9de64aaaef4685d43d0c52a7587fa8c35b25abf1b856c9"} Apr 22 18:21:55.649884 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.649081 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s" event={"ID":"515611f7-a360-418d-b451-b9a1ae66f011","Type":"ContainerStarted","Data":"9262ad42d4e8240695e5bc2cf858ccec4acc7e9d047c13b2de726a168f599973"} Apr 22 18:21:55.649884 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.649095 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s" event={"ID":"515611f7-a360-418d-b451-b9a1ae66f011","Type":"ContainerStarted","Data":"044be7c0d57d0cebf527a59738757ea4fb72c390c476bac29eb54e0e35c9467d"} Apr 22 18:21:55.657029 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.656993 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg" event={"ID":"247e6f19-cb23-47a1-9176-374e21a56b59","Type":"ContainerStarted","Data":"8e19791e132c03e20fbf5e1510746c33c9139d3bea76c574a26cc053f9881b39"} Apr 22 18:21:55.680384 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.680338 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-fxx9s" podStartSLOduration=1.036988995 podStartE2EDuration="3.680321781s" podCreationTimestamp="2026-04-22 18:21:52 +0000 UTC" firstStartedPulling="2026-04-22 18:21:52.572766617 +0000 UTC m=+62.880357910" lastFinishedPulling="2026-04-22 18:21:55.216099389 +0000 UTC m=+65.523690696" observedRunningTime="2026-04-22 18:21:55.679077881 +0000 UTC m=+65.986669197" watchObservedRunningTime="2026-04-22 18:21:55.680321781 +0000 UTC m=+65.987913098" Apr 22 18:21:55.696409 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.696363 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-q5kbg" podStartSLOduration=1.136958175 podStartE2EDuration="3.696346889s" podCreationTimestamp="2026-04-22 18:21:52 +0000 UTC" firstStartedPulling="2026-04-22 18:21:52.656493155 +0000 UTC m=+62.964084447" lastFinishedPulling="2026-04-22 18:21:55.215881854 +0000 UTC m=+65.523473161" observedRunningTime="2026-04-22 18:21:55.696014314 +0000 UTC m=+66.003605646" watchObservedRunningTime="2026-04-22 18:21:55.696346889 +0000 UTC m=+66.003938205" Apr 22 18:21:55.991957 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:55.991916 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:55.994073 ip-10-0-143-88 kubenswrapper[2551]: 
I0422 18:21:55.994047 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:21:56.005536 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.005508 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f76d2d65-0522-4be1-946b-e6c684435f2d-metrics-certs\") pod \"network-metrics-daemon-vvfq2\" (UID: \"f76d2d65-0522-4be1-946b-e6c684435f2d\") " pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:56.064422 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.064352 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bn55n\"" Apr 22 18:21:56.072352 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.072312 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vvfq2" Apr 22 18:21:56.092531 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.092501 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:56.095253 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.095227 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:21:56.105361 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.105338 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:21:56.116372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.116350 2551 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gl9n6\" (UniqueName: \"kubernetes.io/projected/851a7f2e-934f-4bc2-a805-c7e37bdc4757-kube-api-access-gl9n6\") pod \"network-check-target-2h4lz\" (UID: \"851a7f2e-934f-4bc2-a805-c7e37bdc4757\") " pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:56.219292 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.219261 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vvfq2"] Apr 22 18:21:56.225820 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:56.225788 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76d2d65_0522_4be1_946b_e6c684435f2d.slice/crio-7b0e36ad70dbb3c7984fba2e946db3e294309dc25b8780703ad1187b344fbc2f WatchSource:0}: Error finding container 7b0e36ad70dbb3c7984fba2e946db3e294309dc25b8780703ad1187b344fbc2f: Status 404 returned error can't find the container with id 7b0e36ad70dbb3c7984fba2e946db3e294309dc25b8780703ad1187b344fbc2f Apr 22 18:21:56.358833 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.358791 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zkrzn\"" Apr 22 18:21:56.367013 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.366619 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:21:56.489848 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.489815 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-b6647d4f9-jvs9s"] Apr 22 18:21:56.507752 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:21:56.507718 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851a7f2e_934f_4bc2_a805_c7e37bdc4757.slice/crio-70746cc7bfe7295dc7a04b9bc14c73f712fac593e01b4c0cb1bc833dfa2718ea WatchSource:0}: Error finding container 70746cc7bfe7295dc7a04b9bc14c73f712fac593e01b4c0cb1bc833dfa2718ea: Status 404 returned error can't find the container with id 70746cc7bfe7295dc7a04b9bc14c73f712fac593e01b4c0cb1bc833dfa2718ea Apr 22 18:21:56.518395 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.518373 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b6647d4f9-jvs9s"] Apr 22 18:21:56.518395 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.518400 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2h4lz"] Apr 22 18:21:56.518691 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.518513 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.521075 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.521053 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:21:56.521075 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.521067 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:21:56.523529 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.523506 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2kevkpgan318g\"" Apr 22 18:21:56.523746 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.523731 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:21:56.523900 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.523870 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:21:56.524091 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.524070 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sgpbx\"" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597062 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-client-ca-bundle\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597123 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-secret-metrics-server-client-certs\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597154 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-secret-metrics-server-tls\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597187 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c97l6\" (UniqueName: \"kubernetes.io/projected/dfa03dce-2314-47b6-bfbe-8adf7c128630-kube-api-access-c97l6\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597223 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dfa03dce-2314-47b6-bfbe-8adf7c128630-audit-log\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597259 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa03dce-2314-47b6-bfbe-8adf7c128630-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.597423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.597310 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dfa03dce-2314-47b6-bfbe-8adf7c128630-metrics-server-audit-profiles\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.662353 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.662263 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vvfq2" event={"ID":"f76d2d65-0522-4be1-946b-e6c684435f2d","Type":"ContainerStarted","Data":"7b0e36ad70dbb3c7984fba2e946db3e294309dc25b8780703ad1187b344fbc2f"} Apr 22 18:21:56.664043 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.664014 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2h4lz" event={"ID":"851a7f2e-934f-4bc2-a805-c7e37bdc4757","Type":"ContainerStarted","Data":"70746cc7bfe7295dc7a04b9bc14c73f712fac593e01b4c0cb1bc833dfa2718ea"} Apr 22 18:21:56.668381 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.668349 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8s9" event={"ID":"d76607bd-cdff-4883-9d9f-5cea32d17b2c","Type":"ContainerStarted","Data":"cf500ff2a7e8736144d149ef91f4fa8075fcbed8beb3cdded7b126f36c1ec5a1"} Apr 22 18:21:56.668493 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.668387 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8s9" event={"ID":"d76607bd-cdff-4883-9d9f-5cea32d17b2c","Type":"ContainerStarted","Data":"713410e0c0ad13190fc48ddeced322d94889df3f7b05ef263936a29265273878"} Apr 22 18:21:56.689634 
ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.689540 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6n8s9" podStartSLOduration=2.502015675 podStartE2EDuration="4.689522434s" podCreationTimestamp="2026-04-22 18:21:52 +0000 UTC" firstStartedPulling="2026-04-22 18:21:53.028465182 +0000 UTC m=+63.336056481" lastFinishedPulling="2026-04-22 18:21:55.215971946 +0000 UTC m=+65.523563240" observedRunningTime="2026-04-22 18:21:56.688237653 +0000 UTC m=+66.995828968" watchObservedRunningTime="2026-04-22 18:21:56.689522434 +0000 UTC m=+66.997113751" Apr 22 18:21:56.698739 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.698710 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dfa03dce-2314-47b6-bfbe-8adf7c128630-metrics-server-audit-profiles\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.698873 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.698756 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-client-ca-bundle\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.699035 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.699014 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-secret-metrics-server-client-certs\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.699126 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:21:56.699049 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-secret-metrics-server-tls\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.699126 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.699101 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c97l6\" (UniqueName: \"kubernetes.io/projected/dfa03dce-2314-47b6-bfbe-8adf7c128630-kube-api-access-c97l6\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.699218 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.699189 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dfa03dce-2314-47b6-bfbe-8adf7c128630-audit-log\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.699280 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.699263 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa03dce-2314-47b6-bfbe-8adf7c128630-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.700316 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.700291 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dfa03dce-2314-47b6-bfbe-8adf7c128630-audit-log\") pod 
\"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.701192 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.701163 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dfa03dce-2314-47b6-bfbe-8adf7c128630-metrics-server-audit-profiles\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.701275 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.701252 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa03dce-2314-47b6-bfbe-8adf7c128630-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.703485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.703444 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-secret-metrics-server-tls\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.703642 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.703621 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-secret-metrics-server-client-certs\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.704026 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:21:56.704007 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa03dce-2314-47b6-bfbe-8adf7c128630-client-ca-bundle\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.709231 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.709200 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c97l6\" (UniqueName: \"kubernetes.io/projected/dfa03dce-2314-47b6-bfbe-8adf7c128630-kube-api-access-c97l6\") pod \"metrics-server-b6647d4f9-jvs9s\" (UID: \"dfa03dce-2314-47b6-bfbe-8adf7c128630\") " pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:56.830297 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:56.829908 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" Apr 22 18:21:57.012130 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:57.012098 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b6647d4f9-jvs9s"] Apr 22 18:21:57.671928 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:57.671851 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" event={"ID":"dfa03dce-2314-47b6-bfbe-8adf7c128630","Type":"ContainerStarted","Data":"351dee62134dd1c0c95c812cdbba3cf0365ec3e3597f02bc6622a78b5e4243e8"} Apr 22 18:21:57.674757 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:57.674711 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vvfq2" event={"ID":"f76d2d65-0522-4be1-946b-e6c684435f2d","Type":"ContainerStarted","Data":"68832da5b77bafdf7067d1bfc8a663a0905f8391cdcc2cbed9700de927502134"} Apr 22 18:21:58.412031 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.411994 2551 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:21:58.417361 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.417329 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.420762 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.420734 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:21:58.421017 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.420997 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-di2c6c8gp8lps\"" Apr 22 18:21:58.422221 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.421754 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:21:58.426598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.422359 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:21:58.426598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.422986 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:21:58.426598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.424154 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wnslq\"" Apr 22 18:21:58.426598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.424383 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:21:58.426598 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.426076 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 
18:21:58.427113 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.427093 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:21:58.427255 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.427233 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:21:58.427403 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.427387 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:21:58.427662 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.427635 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:21:58.429425 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.429403 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:21:58.431283 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.431260 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:21:58.436337 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.436317 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513496 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:21:58.513585 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513615 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513641 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-config\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513667 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513694 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-kube-api-access-64f2j\") pod \"prometheus-k8s-0\" (UID: 
\"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513741 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513765 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513826 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-web-config\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513853 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513891 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513920 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513965 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.513989 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-config-out\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.514017 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.514331 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.514048 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.515228 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.514071 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.515228 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.514117 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615020 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.614980 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615028 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615048 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-config\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615068 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615092 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-kube-api-access-64f2j\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615122 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615148 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615180 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-web-config\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615205 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615240 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615272 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615319 
2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615344 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-config-out\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615370 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615395 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615420 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615871 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615464 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.615871 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.615493 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.618053 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.618028 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.620302 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.618948 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.620302 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.619986 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.621433 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.621263 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.621777 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.621735 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.628589 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.627497 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.628589 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.628111 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.628752 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.628732 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.629252 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.629114 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-config-out\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.629794 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.629322 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.629794 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.629751 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.629939 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.629829 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.630690 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.630636 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.631345 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.631304 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.631932 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.631905 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-kube-api-access-64f2j\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.632022 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.631938 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-config\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.635129 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.634710 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-web-config\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.638381 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.638330 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:21:58.733319 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:21:58.732830 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:22:07.724208 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:07.724174 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:22:08.715009 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.714338 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bddf957-85jtq" event={"ID":"c52fe620-272c-4caf-8c9b-e7481e236fdd","Type":"ContainerStarted","Data":"d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0"} Apr 22 18:22:08.718985 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.718509 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vvfq2" event={"ID":"f76d2d65-0522-4be1-946b-e6c684435f2d","Type":"ContainerStarted","Data":"90c7e82fd8283c5e1eaeffa4152161eef1a14f4262252b881944292010c29981"} Apr 22 18:22:08.720803 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.720768 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"fa005e74dd0e4f1fd640b09d33ee5e9531a111c030d291f390799b7a5ce358de"} Apr 22 18:22:08.722784 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.722757 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-84r4n" event={"ID":"95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74","Type":"ContainerStarted","Data":"92a50e1a39580ee3846c8c304968fcaf70fd3e840da6bfd7de2f34ef7de65a5d"} Apr 22 18:22:08.723246 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:22:08.723227 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:22:08.725264 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.725238 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2h4lz" event={"ID":"851a7f2e-934f-4bc2-a805-c7e37bdc4757","Type":"ContainerStarted","Data":"2d4ba5af04afca5585e2f7f12cf3213d6b10e6f7b87b9cc769d08830457bb830"} Apr 22 18:22:08.725539 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.725324 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2h4lz" Apr 22 18:22:08.727915 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.727890 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" event={"ID":"dfa03dce-2314-47b6-bfbe-8adf7c128630","Type":"ContainerStarted","Data":"0de6b786ef560992503b71181e63f97333decbca7c7581cde8181c6c13094f8c"} Apr 22 18:22:08.736252 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.736198 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79bddf957-85jtq" podStartSLOduration=3.120982088 podStartE2EDuration="15.736181711s" podCreationTimestamp="2026-04-22 18:21:53 +0000 UTC" firstStartedPulling="2026-04-22 18:21:55.362744462 +0000 UTC m=+65.670335758" lastFinishedPulling="2026-04-22 18:22:07.977944088 +0000 UTC m=+78.285535381" observedRunningTime="2026-04-22 18:22:08.734429176 +0000 UTC m=+79.042020489" watchObservedRunningTime="2026-04-22 18:22:08.736181711 +0000 UTC m=+79.043773022" Apr 22 18:22:08.737489 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.737448 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-84r4n" Apr 22 18:22:08.753657 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.752665 2551 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-84r4n" podStartSLOduration=2.8092032160000002 podStartE2EDuration="25.752648886s" podCreationTimestamp="2026-04-22 18:21:43 +0000 UTC" firstStartedPulling="2026-04-22 18:21:44.647801214 +0000 UTC m=+54.955392507" lastFinishedPulling="2026-04-22 18:22:07.59124687 +0000 UTC m=+77.898838177" observedRunningTime="2026-04-22 18:22:08.751708933 +0000 UTC m=+79.059300247" watchObservedRunningTime="2026-04-22 18:22:08.752648886 +0000 UTC m=+79.060240201" Apr 22 18:22:08.769883 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.769832 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2h4lz" podStartSLOduration=67.302043058 podStartE2EDuration="1m18.769814488s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:21:56.510209571 +0000 UTC m=+66.817800870" lastFinishedPulling="2026-04-22 18:22:07.977981001 +0000 UTC m=+78.285572300" observedRunningTime="2026-04-22 18:22:08.767680578 +0000 UTC m=+79.075271919" watchObservedRunningTime="2026-04-22 18:22:08.769814488 +0000 UTC m=+79.077405804" Apr 22 18:22:08.782533 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.782490 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vvfq2" podStartSLOduration=77.524496785 podStartE2EDuration="1m18.78247693s" podCreationTimestamp="2026-04-22 18:20:50 +0000 UTC" firstStartedPulling="2026-04-22 18:21:56.228431351 +0000 UTC m=+66.536022644" lastFinishedPulling="2026-04-22 18:21:57.486411482 +0000 UTC m=+67.794002789" observedRunningTime="2026-04-22 18:22:08.781980156 +0000 UTC m=+79.089571474" watchObservedRunningTime="2026-04-22 18:22:08.78247693 +0000 UTC m=+79.090068245" Apr 22 18:22:08.799735 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:08.799681 2551 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s" podStartSLOduration=2.198060693 podStartE2EDuration="12.799667046s" podCreationTimestamp="2026-04-22 18:21:56 +0000 UTC" firstStartedPulling="2026-04-22 18:21:57.437018753 +0000 UTC m=+67.744610052" lastFinishedPulling="2026-04-22 18:22:08.038625093 +0000 UTC m=+78.346216405" observedRunningTime="2026-04-22 18:22:08.797227376 +0000 UTC m=+79.104818690" watchObservedRunningTime="2026-04-22 18:22:08.799667046 +0000 UTC m=+79.107258360"
Apr 22 18:22:09.732711 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:09.732674 2551 generic.go:358] "Generic (PLEG): container finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" exitCode=0
Apr 22 18:22:09.733196 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:09.732838 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"}
Apr 22 18:22:13.748460 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:13.748424 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"}
Apr 22 18:22:13.748910 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:13.748469 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"}
Apr 22 18:22:14.349450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:14.348916 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:22:14.349450 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:14.348953 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:22:14.351597 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:14.351520 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:22:14.756130 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:14.756051 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:22:15.184231 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:15.184196 2551 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79bddf957-85jtq"]
Apr 22 18:22:16.761110 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.761075 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"}
Apr 22 18:22:16.761585 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.761116 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"}
Apr 22 18:22:16.761585 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.761132 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"}
Apr 22 18:22:16.761585 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.761145 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerStarted","Data":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"}
Apr 22 18:22:16.794812 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.794766 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=10.729158501 podStartE2EDuration="18.794752063s" podCreationTimestamp="2026-04-22 18:21:58 +0000 UTC" firstStartedPulling="2026-04-22 18:22:07.980879066 +0000 UTC m=+78.288470374" lastFinishedPulling="2026-04-22 18:22:16.046472632 +0000 UTC m=+86.354063936" observedRunningTime="2026-04-22 18:22:16.793292932 +0000 UTC m=+87.100884273" watchObservedRunningTime="2026-04-22 18:22:16.794752063 +0000 UTC m=+87.102343421"
Apr 22 18:22:16.831003 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.830971 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s"
Apr 22 18:22:16.831003 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:16.831009 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s"
Apr 22 18:22:18.733732 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:18.733694 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:22:36.835520 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:36.835493 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s"
Apr 22 18:22:36.839281 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:36.839259 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-b6647d4f9-jvs9s"
Apr 22 18:22:39.735544 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:39.735515 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2h4lz"
Apr 22 18:22:41.783056 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:41.782998 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79bddf957-85jtq" podUID="c52fe620-272c-4caf-8c9b-e7481e236fdd" containerName="console" containerID="cri-o://d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0" gracePeriod=15
Apr 22 18:22:42.067737 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.067716 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79bddf957-85jtq_c52fe620-272c-4caf-8c9b-e7481e236fdd/console/0.log"
Apr 22 18:22:42.067847 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.067793 2551 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:22:42.183205 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183168 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-serving-cert\") pod \"c52fe620-272c-4caf-8c9b-e7481e236fdd\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") "
Apr 22 18:22:42.183372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183215 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zx5d\" (UniqueName: \"kubernetes.io/projected/c52fe620-272c-4caf-8c9b-e7481e236fdd-kube-api-access-7zx5d\") pod \"c52fe620-272c-4caf-8c9b-e7481e236fdd\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") "
Apr 22 18:22:42.183372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183263 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-service-ca\") pod \"c52fe620-272c-4caf-8c9b-e7481e236fdd\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") "
Apr 22 18:22:42.183372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183279 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-oauth-config\") pod \"c52fe620-272c-4caf-8c9b-e7481e236fdd\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") "
Apr 22 18:22:42.183372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183302 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-oauth-serving-cert\") pod \"c52fe620-272c-4caf-8c9b-e7481e236fdd\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") "
Apr 22 18:22:42.183372 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183320 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-config\") pod \"c52fe620-272c-4caf-8c9b-e7481e236fdd\" (UID: \"c52fe620-272c-4caf-8c9b-e7481e236fdd\") "
Apr 22 18:22:42.183762 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183733 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c52fe620-272c-4caf-8c9b-e7481e236fdd" (UID: "c52fe620-272c-4caf-8c9b-e7481e236fdd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:22:42.183870 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.183754 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-service-ca" (OuterVolumeSpecName: "service-ca") pod "c52fe620-272c-4caf-8c9b-e7481e236fdd" (UID: "c52fe620-272c-4caf-8c9b-e7481e236fdd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:22:42.188986 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.184115 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-config" (OuterVolumeSpecName: "console-config") pod "c52fe620-272c-4caf-8c9b-e7481e236fdd" (UID: "c52fe620-272c-4caf-8c9b-e7481e236fdd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:22:42.189122 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.189090 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c52fe620-272c-4caf-8c9b-e7481e236fdd" (UID: "c52fe620-272c-4caf-8c9b-e7481e236fdd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:22:42.189122 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.189102 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52fe620-272c-4caf-8c9b-e7481e236fdd-kube-api-access-7zx5d" (OuterVolumeSpecName: "kube-api-access-7zx5d") pod "c52fe620-272c-4caf-8c9b-e7481e236fdd" (UID: "c52fe620-272c-4caf-8c9b-e7481e236fdd"). InnerVolumeSpecName "kube-api-access-7zx5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:22:42.189242 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.189113 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c52fe620-272c-4caf-8c9b-e7481e236fdd" (UID: "c52fe620-272c-4caf-8c9b-e7481e236fdd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:22:42.284325 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.284296 2551 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-oauth-serving-cert\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:22:42.284325 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.284322 2551 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-config\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:22:42.284325 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.284332 2551 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-serving-cert\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:22:42.284516 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.284341 2551 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zx5d\" (UniqueName: \"kubernetes.io/projected/c52fe620-272c-4caf-8c9b-e7481e236fdd-kube-api-access-7zx5d\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:22:42.284516 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.284350 2551 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c52fe620-272c-4caf-8c9b-e7481e236fdd-service-ca\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:22:42.284516 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.284358 2551 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c52fe620-272c-4caf-8c9b-e7481e236fdd-console-oauth-config\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:22:42.833510 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.833436 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79bddf957-85jtq_c52fe620-272c-4caf-8c9b-e7481e236fdd/console/0.log"
Apr 22 18:22:42.833510 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.833476 2551 generic.go:358] "Generic (PLEG): container finished" podID="c52fe620-272c-4caf-8c9b-e7481e236fdd" containerID="d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0" exitCode=2
Apr 22 18:22:42.834074 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.833511 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bddf957-85jtq" event={"ID":"c52fe620-272c-4caf-8c9b-e7481e236fdd","Type":"ContainerDied","Data":"d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0"}
Apr 22 18:22:42.834074 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.833578 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bddf957-85jtq" event={"ID":"c52fe620-272c-4caf-8c9b-e7481e236fdd","Type":"ContainerDied","Data":"9627103ccdc659c5f4f419c609f60bac424398926915a7655c636cc6d53b3cef"}
Apr 22 18:22:42.834074 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.833590 2551 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79bddf957-85jtq"
Apr 22 18:22:42.834074 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.833596 2551 scope.go:117] "RemoveContainer" containerID="d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0"
Apr 22 18:22:42.844583 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.844541 2551 scope.go:117] "RemoveContainer" containerID="d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0"
Apr 22 18:22:42.844853 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:22:42.844834 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0\": container with ID starting with d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0 not found: ID does not exist" containerID="d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0"
Apr 22 18:22:42.844917 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.844860 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0"} err="failed to get container status \"d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0\": rpc error: code = NotFound desc = could not find container \"d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0\": container with ID starting with d594e5d6e9943cf4e0e786395d0b66cb41a4c076a53e9c4a4525f46ef978daf0 not found: ID does not exist"
Apr 22 18:22:42.852501 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.852481 2551 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79bddf957-85jtq"]
Apr 22 18:22:42.858686 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:42.858663 2551 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79bddf957-85jtq"]
Apr 22 18:22:44.345780 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:44.345745 2551 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52fe620-272c-4caf-8c9b-e7481e236fdd" path="/var/lib/kubelet/pods/c52fe620-272c-4caf-8c9b-e7481e236fdd/volumes"
Apr 22 18:22:58.733703 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:58.733670 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:22:58.753761 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:58.753733 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:22:58.908970 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:22:58.908946 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:23:16.924630 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.924593 2551 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:23:16.925854 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.925816 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="prometheus" containerID="cri-o://2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" gracePeriod=600
Apr 22 18:23:16.926083 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.925925 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-thanos" containerID="cri-o://8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" gracePeriod=600
Apr 22 18:23:16.926181 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.925950 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy" containerID="cri-o://22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" gracePeriod=600
Apr 22 18:23:16.926181 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.925966 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="thanos-sidecar" containerID="cri-o://10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" gracePeriod=600
Apr 22 18:23:16.926293 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.926023 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="config-reloader" containerID="cri-o://7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" gracePeriod=600
Apr 22 18:23:16.926345 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:16.926274 2551 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-web" containerID="cri-o://abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" gracePeriod=600
Apr 22 18:23:17.169312 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.169283 2551 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:23:17.234914 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.234827 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-thanos-prometheus-http-client-file\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.234914 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.234874 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-metrics-client-ca\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.234915 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-tls-assets\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.234959 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-kube-api-access-64f2j\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.234988 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-tls\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235054 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-grpc-tls\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235079 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-config\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235114 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235106 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-db\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235130 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-kubelet-serving-ca-bundle\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235166 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-kube-rbac-proxy\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235203 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-config-out\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235228 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-rulefiles-0\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235252 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-metrics-client-certs\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235277 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235313 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-trusted-ca-bundle\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235314 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:23:17.235410 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.235346 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-web-config\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.237070 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.236346 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:23:17.237070 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.236581 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-serving-certs-ca-bundle\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.237070 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.236620 2551 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"9cb64b9f-bd16-4086-96a7-e642876679dd\" (UID: \"9cb64b9f-bd16-4086-96a7-e642876679dd\") "
Apr 22 18:23:17.237070 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.236904 2551 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-db\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.237070 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.236926 2551 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-metrics-client-ca\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.239358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.237671 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:23:17.239358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.237805 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:23:17.239358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.239145 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:23:17.239358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.239227 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.239358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.239251 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.239358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.239299 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-config" (OuterVolumeSpecName: "config") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.239734 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.239625 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:23:17.239734 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.239696 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.240615 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.240578 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.240813 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.240786 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:23:17.241049 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.241008 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.241258 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.241231 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.242489 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.242447 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-kube-api-access-64f2j" (OuterVolumeSpecName: "kube-api-access-64f2j") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "kube-api-access-64f2j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:23:17.242937 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.242911 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-config-out" (OuterVolumeSpecName: "config-out") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:23:17.243014 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.242981 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.250977 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.250942 2551 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-web-config" (OuterVolumeSpecName: "web-config") pod "9cb64b9f-bd16-4086-96a7-e642876679dd" (UID: "9cb64b9f-bd16-4086-96a7-e642876679dd"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:23:17.337470 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337432 2551 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-kube-api-access-64f2j\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337470 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337467 2551 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337482 2551 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-grpc-tls\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337495 2551 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-config\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337509 2551 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337521 2551 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-kube-rbac-proxy\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337535 2551 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cb64b9f-bd16-4086-96a7-e642876679dd-config-out\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337574 2551 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337587 2551 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-metrics-client-certs\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337600 2551 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337613 2551 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337626 2551 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-web-config\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\""
Apr 22 18:23:17.337723 ip-10-0-143-88
kubenswrapper[2551]: I0422 18:23:17.337639 2551 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb64b9f-bd16-4086-96a7-e642876679dd-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\"" Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337652 2551 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\"" Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337667 2551 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cb64b9f-bd16-4086-96a7-e642876679dd-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\"" Apr 22 18:23:17.337723 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.337680 2551 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cb64b9f-bd16-4086-96a7-e642876679dd-tls-assets\") on node \"ip-10-0-143-88.ec2.internal\" DevicePath \"\"" Apr 22 18:23:17.948807 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948781 2551 generic.go:358] "Generic (PLEG): container finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" exitCode=0 Apr 22 18:23:17.948807 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948803 2551 generic.go:358] "Generic (PLEG): container finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" exitCode=0 Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948811 2551 generic.go:358] "Generic (PLEG): container 
finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" exitCode=0 Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948819 2551 generic.go:358] "Generic (PLEG): container finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" exitCode=0 Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948825 2551 generic.go:358] "Generic (PLEG): container finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" exitCode=0 Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948830 2551 generic.go:358] "Generic (PLEG): container finished" podID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" exitCode=0 Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948863 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948885 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948896 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} Apr 22 18:23:17.949249 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:23:17.948906 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948915 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948924 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948933 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9cb64b9f-bd16-4086-96a7-e642876679dd","Type":"ContainerDied","Data":"fa005e74dd0e4f1fd640b09d33ee5e9531a111c030d291f390799b7a5ce358de"} Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948946 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:17.949249 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.948990 2551 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:17.956786 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.956684 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:17.963333 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.963315 2551 scope.go:117] "RemoveContainer" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:17.971196 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.971176 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:17.972243 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.972158 2551 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:17.976141 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.976118 2551 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:17.978479 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.978461 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:17.984466 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.984451 2551 scope.go:117] "RemoveContainer" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:17.990477 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.990458 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:17.996145 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996129 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:17.996382 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.996363 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:17.996437 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996390 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} err="failed to get container status \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": rpc error: code = NotFound desc = could not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" Apr 22 18:23:17.996437 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996408 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:17.996678 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.996652 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:17.996742 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996680 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} err="failed to get container status \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": rpc error: code = NotFound desc = could not find container 
\"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" Apr 22 18:23:17.996742 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996699 2551 scope.go:117] "RemoveContainer" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:17.996932 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.996914 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:17.996986 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996948 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} err="failed to get container status \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": rpc error: code = NotFound desc = could not find container \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" Apr 22 18:23:17.996986 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.996963 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:17.997191 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.997174 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 
10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:17.997258 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.997193 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} err="failed to get container status \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" Apr 22 18:23:17.997258 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.997207 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:17.997457 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.997436 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:17.997565 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.997486 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} err="failed to get container status \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 
7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" Apr 22 18:23:17.997565 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.997500 2551 scope.go:117] "RemoveContainer" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:17.997796 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.997774 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:17.997880 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.997810 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} err="failed to get container status \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" Apr 22 18:23:17.997880 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.997832 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:17.998085 ip-10-0-143-88 kubenswrapper[2551]: E0422 18:23:17.998070 2551 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" 
containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:17.998130 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998089 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"} err="failed to get container status \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": rpc error: code = NotFound desc = could not find container \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" Apr 22 18:23:17.998130 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998101 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:17.998369 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998341 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} err="failed to get container status \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": rpc error: code = NotFound desc = could not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" Apr 22 18:23:17.998369 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998361 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:17.998604 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998585 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} err="failed to get container status 
\"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": rpc error: code = NotFound desc = could not find container \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" Apr 22 18:23:17.998659 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998606 2551 scope.go:117] "RemoveContainer" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:17.998812 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998794 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} err="failed to get container status \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": rpc error: code = NotFound desc = could not find container \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" Apr 22 18:23:17.998877 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.998814 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:17.999035 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999018 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} err="failed to get container status \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" Apr 22 18:23:17.999079 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:23:17.999035 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:17.999242 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999224 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} err="failed to get container status \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" Apr 22 18:23:17.999311 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999243 2551 scope.go:117] "RemoveContainer" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:17.999425 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999409 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} err="failed to get container status \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" Apr 22 18:23:17.999485 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999428 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:17.999666 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999649 2551 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"} err="failed to get container status \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": rpc error: code = NotFound desc = could not find container \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" Apr 22 18:23:17.999721 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999666 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:17.999866 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999849 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} err="failed to get container status \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": rpc error: code = NotFound desc = could not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" Apr 22 18:23:17.999907 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:17.999867 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:18.000054 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000040 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} err="failed to get container status \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": rpc error: code = NotFound desc = could not find container \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 
22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" Apr 22 18:23:18.000095 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000054 2551 scope.go:117] "RemoveContainer" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:18.000223 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000207 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} err="failed to get container status \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": rpc error: code = NotFound desc = could not find container \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" Apr 22 18:23:18.000269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000223 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:18.000399 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000384 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} err="failed to get container status \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" Apr 22 18:23:18.000445 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000398 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:18.000584 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000570 2551 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} err="failed to get container status \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" Apr 22 18:23:18.000626 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000585 2551 scope.go:117] "RemoveContainer" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:18.000736 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000717 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} err="failed to get container status \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" Apr 22 18:23:18.000794 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000737 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:18.000944 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000929 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"} err="failed to get container status \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": rpc error: code = NotFound desc = could not find container 
\"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" Apr 22 18:23:18.000988 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.000944 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:18.001124 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001108 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} err="failed to get container status \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": rpc error: code = NotFound desc = could not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" Apr 22 18:23:18.001185 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001125 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:18.001324 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001295 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} err="failed to get container status \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": rpc error: code = NotFound desc = could not find container \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" Apr 22 18:23:18.001389 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001326 2551 scope.go:117] "RemoveContainer" 
containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:18.001511 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001493 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} err="failed to get container status \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": rpc error: code = NotFound desc = could not find container \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" Apr 22 18:23:18.001580 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001513 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:18.001715 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001698 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} err="failed to get container status \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" Apr 22 18:23:18.001781 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001716 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:18.001930 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001910 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} err="failed to get container status 
\"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" Apr 22 18:23:18.001975 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.001931 2551 scope.go:117] "RemoveContainer" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:18.002153 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002136 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} err="failed to get container status \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" Apr 22 18:23:18.002216 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002155 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:18.002362 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002346 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"} err="failed to get container status \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": rpc error: code = NotFound desc = could not find container \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" Apr 22 18:23:18.002402 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:23:18.002363 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:18.002541 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002527 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} err="failed to get container status \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": rpc error: code = NotFound desc = could not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" Apr 22 18:23:18.002624 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002542 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:18.002774 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002760 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} err="failed to get container status \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": rpc error: code = NotFound desc = could not find container \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" Apr 22 18:23:18.002814 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002775 2551 scope.go:117] "RemoveContainer" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:18.002958 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002940 2551 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} err="failed to get container status \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": rpc error: code = NotFound desc = could not find container \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" Apr 22 18:23:18.003002 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.002958 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:18.003145 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003131 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} err="failed to get container status \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" Apr 22 18:23:18.003195 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003145 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:18.003338 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003322 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} err="failed to get container status \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 
7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" Apr 22 18:23:18.003377 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003338 2551 scope.go:117] "RemoveContainer" containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:18.003522 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003508 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} err="failed to get container status \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" Apr 22 18:23:18.003584 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003523 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:18.003740 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003719 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"} err="failed to get container status \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": rpc error: code = NotFound desc = could not find container \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" Apr 22 18:23:18.003784 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003741 2551 scope.go:117] "RemoveContainer" containerID="8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592" Apr 22 18:23:18.003956 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003938 2551 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592"} err="failed to get container status \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": rpc error: code = NotFound desc = could not find container \"8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592\": container with ID starting with 8560eefeeb7ab4d48df3eb4d1ffee98eb0dfccf700b302b1a7858e81ec83f592 not found: ID does not exist" Apr 22 18:23:18.004020 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.003957 2551 scope.go:117] "RemoveContainer" containerID="22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f" Apr 22 18:23:18.004156 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004139 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f"} err="failed to get container status \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": rpc error: code = NotFound desc = could not find container \"22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f\": container with ID starting with 22cd5cadff0f2487ebdf83b41145ea4cbd6b326c63e47449337c51cd03cf245f not found: ID does not exist" Apr 22 18:23:18.004201 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004158 2551 scope.go:117] "RemoveContainer" containerID="abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92" Apr 22 18:23:18.004364 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004348 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92"} err="failed to get container status \"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": rpc error: code = NotFound desc = could not find container 
\"abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92\": container with ID starting with abf8d98ac5593a67fea2b8a5b50523b02991bdc2bf38410f05d1e98582d65e92 not found: ID does not exist" Apr 22 18:23:18.004364 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004363 2551 scope.go:117] "RemoveContainer" containerID="10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634" Apr 22 18:23:18.004571 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004540 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634"} err="failed to get container status \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": rpc error: code = NotFound desc = could not find container \"10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634\": container with ID starting with 10ff69a52400b5b9e547e5517cfaaea421cb7d4e0c73176c0dc98d7d12341634 not found: ID does not exist" Apr 22 18:23:18.004620 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004572 2551 scope.go:117] "RemoveContainer" containerID="7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8" Apr 22 18:23:18.004752 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004733 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8"} err="failed to get container status \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": rpc error: code = NotFound desc = could not find container \"7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8\": container with ID starting with 7c27b034e6649db708e58e2ae77ee48439ee08c1a2ddb2aed8580307331ac0c8 not found: ID does not exist" Apr 22 18:23:18.004799 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004754 2551 scope.go:117] "RemoveContainer" 
containerID="2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60" Apr 22 18:23:18.004971 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004955 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60"} err="failed to get container status \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": rpc error: code = NotFound desc = could not find container \"2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60\": container with ID starting with 2ff124591345ce1b48fef1929d131f8b58da516b55acbcc0838441cc5c56cd60 not found: ID does not exist" Apr 22 18:23:18.005014 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.004972 2551 scope.go:117] "RemoveContainer" containerID="7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2" Apr 22 18:23:18.005132 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.005116 2551 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2"} err="failed to get container status \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": rpc error: code = NotFound desc = could not find container \"7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2\": container with ID starting with 7e9e429865d3468056b148fdc2b83a83e738196dd6b27184da2b4288eb6e3fb2 not found: ID does not exist" Apr 22 18:23:18.007665 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007647 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:18.007930 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007916 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-web" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007933 
2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-web" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007941 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-thanos" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007947 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-thanos" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007954 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="prometheus" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007959 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="prometheus" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007969 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="thanos-sidecar" Apr 22 18:23:18.007976 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007976 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="thanos-sidecar" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007986 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c52fe620-272c-4caf-8c9b-e7481e236fdd" containerName="console" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007991 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52fe620-272c-4caf-8c9b-e7481e236fdd" containerName="console" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.007997 2551 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="config-reloader" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008002 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="config-reloader" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008012 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008017 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008024 2551 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="init-config-reloader" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008029 2551 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="init-config-reloader" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008070 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="config-reloader" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008077 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-thanos" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008083 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="c52fe620-272c-4caf-8c9b-e7481e236fdd" containerName="console" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:23:18.008089 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="thanos-sidecar" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008095 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy-web" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008101 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="prometheus" Apr 22 18:23:18.008163 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.008106 2551 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" containerName="kube-rbac-proxy" Apr 22 18:23:18.013199 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.013184 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.019173 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.019015 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:23:18.019371 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.019351 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:23:18.019467 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.019395 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:23:18.019467 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.019401 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-di2c6c8gp8lps\"" Apr 22 18:23:18.019467 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.019446 
2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:23:18.019654 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.019609 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:23:18.020427 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.020411 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:23:18.020529 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.020513 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:23:18.021182 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.021167 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wnslq\"" Apr 22 18:23:18.022467 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.022454 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:23:18.023374 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.023361 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:23:18.023644 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.023627 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:23:18.025316 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.025300 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:23:18.032614 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.031202 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042365 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042419 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042444 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042479 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-web-config\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042508 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042533 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042596 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.042670 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042622 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed771409-6373-41c8-bf2a-e0f748bb6d03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042694 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-config\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:23:18.042730 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042760 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042783 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042809 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6vwb\" (UniqueName: \"kubernetes.io/projected/ed771409-6373-41c8-bf2a-e0f748bb6d03-kube-api-access-l6vwb\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042835 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/ed771409-6373-41c8-bf2a-e0f748bb6d03-config-out\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042859 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042904 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042957 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.043177 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.042991 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.044011 ip-10-0-143-88 kubenswrapper[2551]: I0422 
18:23:18.043990 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:18.143871 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143843 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143880 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143902 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143918 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143940 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-web-config\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143969 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.143994 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144026 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144055 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed771409-6373-41c8-bf2a-e0f748bb6d03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144091 2551 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-config\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144106 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144129 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144154 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144176 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6vwb\" (UniqueName: \"kubernetes.io/projected/ed771409-6373-41c8-bf2a-e0f748bb6d03-kube-api-access-l6vwb\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 
kubenswrapper[2551]: I0422 18:23:18.144200 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed771409-6373-41c8-bf2a-e0f748bb6d03-config-out\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144228 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144269 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144252 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144781 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144289 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.144781 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144443 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.145054 
ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.144910 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.146213 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.145775 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.146213 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.145787 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.146213 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.145950 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.147616 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.147170 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.147616 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.147281 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-web-config\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.148612 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.147988 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.148612 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.148311 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.148714 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.148691 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ed771409-6373-41c8-bf2a-e0f748bb6d03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.149085 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.149065 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.149237 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.149215 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-config\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.149390 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.149367 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.149964 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.149932 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.150350 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.150330 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed771409-6373-41c8-bf2a-e0f748bb6d03-config-out\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.150350 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.150347 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ed771409-6373-41c8-bf2a-e0f748bb6d03-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.151094 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.151077 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed771409-6373-41c8-bf2a-e0f748bb6d03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.154958 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.154940 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6vwb\" (UniqueName: \"kubernetes.io/projected/ed771409-6373-41c8-bf2a-e0f748bb6d03-kube-api-access-l6vwb\") pod \"prometheus-k8s-0\" (UID: \"ed771409-6373-41c8-bf2a-e0f748bb6d03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.322511 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.322423 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:18.345975 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.345941 2551 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb64b9f-bd16-4086-96a7-e642876679dd" path="/var/lib/kubelet/pods/9cb64b9f-bd16-4086-96a7-e642876679dd/volumes" Apr 22 18:23:18.455025 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.454993 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:18.460488 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:23:18.460464 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded771409_6373_41c8_bf2a_e0f748bb6d03.slice/crio-407495eaa221fda77f0a6ecfe87419df76ea675e98f62cad7d250f3dc50fc562 WatchSource:0}: Error finding container 407495eaa221fda77f0a6ecfe87419df76ea675e98f62cad7d250f3dc50fc562: Status 404 returned error can't find the container with id 407495eaa221fda77f0a6ecfe87419df76ea675e98f62cad7d250f3dc50fc562 Apr 22 18:23:18.953568 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.953529 2551 generic.go:358] "Generic (PLEG): container finished" podID="ed771409-6373-41c8-bf2a-e0f748bb6d03" containerID="514f3d806c4f435e1085b535ea9f038e2f86435f0e828db2811f5bc534bba235" exitCode=0 Apr 22 18:23:18.953894 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.953578 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerDied","Data":"514f3d806c4f435e1085b535ea9f038e2f86435f0e828db2811f5bc534bba235"} Apr 22 18:23:18.953894 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:18.953598 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"407495eaa221fda77f0a6ecfe87419df76ea675e98f62cad7d250f3dc50fc562"} Apr 22 18:23:19.960064 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.960030 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"cdb19c40f948708199b70efcc29b0d2917017270f6cb7bc4e704080e6c12890d"} Apr 22 18:23:19.960064 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.960067 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"d52ca6d43c0547486b1fa56ecac6c7607607df04b11b989d15837e9d2e6b2c4c"} Apr 22 18:23:19.960440 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.960077 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"9b9d96ad650dfe01adf6b71376261aefd8d2a593c5236330886e39cbb8cb00ac"} Apr 22 18:23:19.960440 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.960085 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"0e5e7e758fae57594fee59a0c499680975a1937f47b9b50a93819b9ba832f885"} Apr 22 18:23:19.960440 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.960092 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"729b211e72241c49f6bb65a6b2dd5dbbefebfc84c450fee50fb653a7d39a057d"} Apr 22 18:23:19.960440 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.960099 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ed771409-6373-41c8-bf2a-e0f748bb6d03","Type":"ContainerStarted","Data":"f82cabb0a1d034a1a9c105359970f5e8fdbae8bda23268fe92ac688c6b047fbe"} Apr 22 18:23:19.988142 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:19.988096 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.988081463 podStartE2EDuration="2.988081463s" podCreationTimestamp="2026-04-22 18:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:19.986253089 +0000 UTC m=+150.293844403" watchObservedRunningTime="2026-04-22 18:23:19.988081463 +0000 UTC m=+150.295672777" Apr 22 18:23:23.322845 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:23.322798 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:53.227985 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.227948 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t785z"] Apr 22 18:23:53.232711 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.232687 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.235108 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.235089 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:23:53.238413 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.238389 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t785z"] Apr 22 18:23:53.328013 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.327977 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/862a9bbc-5160-43bf-95af-7eed8226c026-kubelet-config\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.328013 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.328011 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/862a9bbc-5160-43bf-95af-7eed8226c026-dbus\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.328199 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.328045 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/862a9bbc-5160-43bf-95af-7eed8226c026-original-pull-secret\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.429133 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.429100 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/862a9bbc-5160-43bf-95af-7eed8226c026-kubelet-config\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.429133 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.429134 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/862a9bbc-5160-43bf-95af-7eed8226c026-dbus\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.429358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.429167 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/862a9bbc-5160-43bf-95af-7eed8226c026-original-pull-secret\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.429358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.429235 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/862a9bbc-5160-43bf-95af-7eed8226c026-kubelet-config\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.429358 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.429319 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/862a9bbc-5160-43bf-95af-7eed8226c026-dbus\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.431544 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.431528 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/862a9bbc-5160-43bf-95af-7eed8226c026-original-pull-secret\") pod \"global-pull-secret-syncer-t785z\" (UID: \"862a9bbc-5160-43bf-95af-7eed8226c026\") " pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.542035 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.541970 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t785z" Apr 22 18:23:53.657365 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:53.657335 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t785z"] Apr 22 18:23:53.660445 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:23:53.660419 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod862a9bbc_5160_43bf_95af_7eed8226c026.slice/crio-ed28f0737cce24eaf834ef1b77beb2fbc902295ac76d936d231e4f5fe3f5e17e WatchSource:0}: Error finding container ed28f0737cce24eaf834ef1b77beb2fbc902295ac76d936d231e4f5fe3f5e17e: Status 404 returned error can't find the container with id ed28f0737cce24eaf834ef1b77beb2fbc902295ac76d936d231e4f5fe3f5e17e Apr 22 18:23:54.067489 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:54.067450 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t785z" event={"ID":"862a9bbc-5160-43bf-95af-7eed8226c026","Type":"ContainerStarted","Data":"ed28f0737cce24eaf834ef1b77beb2fbc902295ac76d936d231e4f5fe3f5e17e"} Apr 22 18:23:58.080837 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:58.080799 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t785z" event={"ID":"862a9bbc-5160-43bf-95af-7eed8226c026","Type":"ContainerStarted","Data":"3291112447d8f4df592580c0f68165a8f45fb502cf065fa50471cdaeca3c2f30"} Apr 22 18:23:58.094978 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:23:58.094836 2551 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t785z" podStartSLOduration=1.223455075 podStartE2EDuration="5.094819451s" podCreationTimestamp="2026-04-22 18:23:53 +0000 UTC" firstStartedPulling="2026-04-22 18:23:53.662449625 +0000 UTC m=+183.970040918" lastFinishedPulling="2026-04-22 18:23:57.533814 +0000 UTC m=+187.841405294" observedRunningTime="2026-04-22 18:23:58.094145606 +0000 UTC m=+188.401736931" watchObservedRunningTime="2026-04-22 18:23:58.094819451 +0000 UTC m=+188.402410766" Apr 22 18:24:18.322767 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:18.322729 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:18.338219 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:18.338194 2551 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:19.151700 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:19.151662 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:24.483729 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.483691 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tgcct/must-gather-tk58b"] Apr 22 18:24:24.489428 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.489398 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.491413 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.491391 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tgcct\"/\"openshift-service-ca.crt\"" Apr 22 18:24:24.492144 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.492119 2551 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tgcct\"/\"kube-root-ca.crt\"" Apr 22 18:24:24.492144 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.492119 2551 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tgcct\"/\"default-dockercfg-p5vpd\"" Apr 22 18:24:24.494170 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.494149 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tgcct/must-gather-tk58b"] Apr 22 18:24:24.560744 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.560705 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b02b1235-c329-400b-bb10-ef8c5603a00c-must-gather-output\") pod \"must-gather-tk58b\" (UID: \"b02b1235-c329-400b-bb10-ef8c5603a00c\") " pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.560744 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.560743 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchvq\" (UniqueName: \"kubernetes.io/projected/b02b1235-c329-400b-bb10-ef8c5603a00c-kube-api-access-lchvq\") pod \"must-gather-tk58b\" (UID: \"b02b1235-c329-400b-bb10-ef8c5603a00c\") " pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.661138 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.661104 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/b02b1235-c329-400b-bb10-ef8c5603a00c-must-gather-output\") pod \"must-gather-tk58b\" (UID: \"b02b1235-c329-400b-bb10-ef8c5603a00c\") " pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.661310 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.661144 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lchvq\" (UniqueName: \"kubernetes.io/projected/b02b1235-c329-400b-bb10-ef8c5603a00c-kube-api-access-lchvq\") pod \"must-gather-tk58b\" (UID: \"b02b1235-c329-400b-bb10-ef8c5603a00c\") " pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.661512 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.661491 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b02b1235-c329-400b-bb10-ef8c5603a00c-must-gather-output\") pod \"must-gather-tk58b\" (UID: \"b02b1235-c329-400b-bb10-ef8c5603a00c\") " pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.668616 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.668596 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchvq\" (UniqueName: \"kubernetes.io/projected/b02b1235-c329-400b-bb10-ef8c5603a00c-kube-api-access-lchvq\") pod \"must-gather-tk58b\" (UID: \"b02b1235-c329-400b-bb10-ef8c5603a00c\") " pod="openshift-must-gather-tgcct/must-gather-tk58b" Apr 22 18:24:24.808500 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.808422 2551 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tgcct/must-gather-tk58b"
Apr 22 18:24:24.923188 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:24.923158 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tgcct/must-gather-tk58b"]
Apr 22 18:24:24.926580 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:24:24.926526 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb02b1235_c329_400b_bb10_ef8c5603a00c.slice/crio-9059a533a7b682022cfeb7637f66f08f9973e46126b1796708cd1777b7b4cba1 WatchSource:0}: Error finding container 9059a533a7b682022cfeb7637f66f08f9973e46126b1796708cd1777b7b4cba1: Status 404 returned error can't find the container with id 9059a533a7b682022cfeb7637f66f08f9973e46126b1796708cd1777b7b4cba1
Apr 22 18:24:25.153991 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:25.153957 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tgcct/must-gather-tk58b" event={"ID":"b02b1235-c329-400b-bb10-ef8c5603a00c","Type":"ContainerStarted","Data":"9059a533a7b682022cfeb7637f66f08f9973e46126b1796708cd1777b7b4cba1"}
Apr 22 18:24:26.160490 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:26.160451 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tgcct/must-gather-tk58b" event={"ID":"b02b1235-c329-400b-bb10-ef8c5603a00c","Type":"ContainerStarted","Data":"39335c58aa53722afa4372117acdeb021ac31e35041698e2618cdee466b1673b"}
Apr 22 18:24:26.160490 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:26.160498 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tgcct/must-gather-tk58b" event={"ID":"b02b1235-c329-400b-bb10-ef8c5603a00c","Type":"ContainerStarted","Data":"d5d7e7abe7e4659bd863c7e423171d9082cfa9c5f071e76c47eab8bfbaf26491"}
Apr 22 18:24:26.176938 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:26.176872 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tgcct/must-gather-tk58b" podStartSLOduration=1.306101728 podStartE2EDuration="2.176855151s" podCreationTimestamp="2026-04-22 18:24:24 +0000 UTC" firstStartedPulling="2026-04-22 18:24:24.928381747 +0000 UTC m=+215.235973044" lastFinishedPulling="2026-04-22 18:24:25.799135175 +0000 UTC m=+216.106726467" observedRunningTime="2026-04-22 18:24:26.174782294 +0000 UTC m=+216.482373610" watchObservedRunningTime="2026-04-22 18:24:26.176855151 +0000 UTC m=+216.484446468"
Apr 22 18:24:27.199866 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:27.199840 2551 ???:1] "http: TLS handshake error from 10.0.141.172:43838: EOF"
Apr 22 18:24:27.200713 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:27.200693 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-t785z_862a9bbc-5160-43bf-95af-7eed8226c026/global-pull-secret-syncer/0.log"
Apr 22 18:24:27.351924 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:27.351895 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z9whf_2d8bb1f9-6a08-4ee4-a71a-28dee8bb54ba/konnectivity-agent/0.log"
Apr 22 18:24:27.399435 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:27.399409 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-88.ec2.internal_34174f385361dd3a899cc07554d23095/haproxy/0.log"
Apr 22 18:24:30.872883 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:30.872849 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-fxx9s_515611f7-a360-418d-b451-b9a1ae66f011/kube-state-metrics/0.log"
Apr 22 18:24:30.893885 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:30.893845 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-fxx9s_515611f7-a360-418d-b451-b9a1ae66f011/kube-rbac-proxy-main/0.log"
Apr 22 18:24:30.915437 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:30.915402 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-fxx9s_515611f7-a360-418d-b451-b9a1ae66f011/kube-rbac-proxy-self/0.log"
Apr 22 18:24:30.949682 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:30.949639 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-b6647d4f9-jvs9s_dfa03dce-2314-47b6-bfbe-8adf7c128630/metrics-server/0.log"
Apr 22 18:24:31.006847 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.006817 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6n8s9_d76607bd-cdff-4883-9d9f-5cea32d17b2c/node-exporter/0.log"
Apr 22 18:24:31.031561 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.031480 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6n8s9_d76607bd-cdff-4883-9d9f-5cea32d17b2c/kube-rbac-proxy/0.log"
Apr 22 18:24:31.053638 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.053519 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6n8s9_d76607bd-cdff-4883-9d9f-5cea32d17b2c/init-textfile/0.log"
Apr 22 18:24:31.227253 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.227218 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-q5kbg_247e6f19-cb23-47a1-9176-374e21a56b59/kube-rbac-proxy-main/0.log"
Apr 22 18:24:31.251863 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.251825 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-q5kbg_247e6f19-cb23-47a1-9176-374e21a56b59/kube-rbac-proxy-self/0.log"
Apr 22 18:24:31.276529 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.276506 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-q5kbg_247e6f19-cb23-47a1-9176-374e21a56b59/openshift-state-metrics/0.log"
Apr 22 18:24:31.313286 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.313205 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/prometheus/0.log"
Apr 22 18:24:31.343704 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.343501 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/config-reloader/0.log"
Apr 22 18:24:31.368338 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.368265 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/thanos-sidecar/0.log"
Apr 22 18:24:31.404322 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.404296 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/kube-rbac-proxy-web/0.log"
Apr 22 18:24:31.433926 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.433888 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/kube-rbac-proxy/0.log"
Apr 22 18:24:31.461775 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.461713 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/kube-rbac-proxy-thanos/0.log"
Apr 22 18:24:31.486098 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.486073 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed771409-6373-41c8-bf2a-e0f748bb6d03/init-config-reloader/0.log"
Apr 22 18:24:31.520138 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.520098 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-24bwx_4b939c03-6181-4d6d-967f-165ff0a8e2e6/prometheus-operator/0.log"
Apr 22 18:24:31.546053 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.546026 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-24bwx_4b939c03-6181-4d6d-967f-165ff0a8e2e6/kube-rbac-proxy/0.log"
Apr 22 18:24:31.575777 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:31.575754 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9qx2b_e62322ef-140c-4f4a-b425-1efdef4c09e9/prometheus-operator-admission-webhook/0.log"
Apr 22 18:24:33.563512 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.563481 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-84r4n_95e39a23-bd8f-4cdd-8bf8-f2edb8f1fb74/download-server/0.log"
Apr 22 18:24:33.656975 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.656916 2551 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"]
Apr 22 18:24:33.660473 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.660448 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.673754 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.673729 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"]
Apr 22 18:24:33.743087 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.743056 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-sys\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.743267 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.743096 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxtqr\" (UniqueName: \"kubernetes.io/projected/5cdcca37-5d32-41e5-9713-7292b0130964-kube-api-access-jxtqr\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.743267 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.743172 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-lib-modules\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.743267 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.743209 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-podres\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.743267 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.743236 2551 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-proc\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844082 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844047 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-sys\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844236 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844092 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxtqr\" (UniqueName: \"kubernetes.io/projected/5cdcca37-5d32-41e5-9713-7292b0130964-kube-api-access-jxtqr\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844236 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844120 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-lib-modules\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844236 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844140 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-podres\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844236 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844163 2551 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-proc\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844236 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844172 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-sys\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844416 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844271 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-podres\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844416 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844271 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-lib-modules\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.844416 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.844313 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5cdcca37-5d32-41e5-9713-7292b0130964-proc\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.851901 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.851848 2551 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxtqr\" (UniqueName: \"kubernetes.io/projected/5cdcca37-5d32-41e5-9713-7292b0130964-kube-api-access-jxtqr\") pod \"perf-node-gather-daemonset-brg25\" (UID: \"5cdcca37-5d32-41e5-9713-7292b0130964\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:33.970808 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:33.970769 2551 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:34.105302 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:34.105215 2551 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"]
Apr 22 18:24:34.108189 ip-10-0-143-88 kubenswrapper[2551]: W0422 18:24:34.108161 2551 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5cdcca37_5d32_41e5_9713_7292b0130964.slice/crio-a6fdf903d40544e6db4ee6379282b3317821f9b9fd37c15848e0a19f08b16a4e WatchSource:0}: Error finding container a6fdf903d40544e6db4ee6379282b3317821f9b9fd37c15848e0a19f08b16a4e: Status 404 returned error can't find the container with id a6fdf903d40544e6db4ee6379282b3317821f9b9fd37c15848e0a19f08b16a4e
Apr 22 18:24:34.188237 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:34.188197 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25" event={"ID":"5cdcca37-5d32-41e5-9713-7292b0130964","Type":"ContainerStarted","Data":"a6fdf903d40544e6db4ee6379282b3317821f9b9fd37c15848e0a19f08b16a4e"}
Apr 22 18:24:34.673105 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:34.673061 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t4nt7_9cce88c0-9408-413b-9cd7-e5fc219d25a9/dns/0.log"
Apr 22 18:24:34.696334 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:34.696309 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t4nt7_9cce88c0-9408-413b-9cd7-e5fc219d25a9/kube-rbac-proxy/0.log"
Apr 22 18:24:34.721235 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:34.721212 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-49bjb_c818e9ac-a0e9-433e-9277-17075adf6903/dns-node-resolver/0.log"
Apr 22 18:24:35.192009 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:35.191977 2551 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25" event={"ID":"5cdcca37-5d32-41e5-9713-7292b0130964","Type":"ContainerStarted","Data":"4ede0798d00ec20b9c31a391c60f9d450ccb2bd380f8c87c6d986b51ef32bbed"}
Apr 22 18:24:35.192167 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:35.192111 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:35.212504 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:35.212459 2551 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25" podStartSLOduration=2.212441469 podStartE2EDuration="2.212441469s" podCreationTimestamp="2026-04-22 18:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:24:35.207360416 +0000 UTC m=+225.514951743" watchObservedRunningTime="2026-04-22 18:24:35.212441469 +0000 UTC m=+225.520032809"
Apr 22 18:24:35.253935 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:35.253915 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vs5cv_5ebef991-9163-4261-ab37-3aa25f981e43/node-ca/0.log"
Apr 22 18:24:36.466832 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:36.466802 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-brzh2_7cdb9284-ca5f-40ac-ab03-4001937629de/serve-healthcheck-canary/0.log"
Apr 22 18:24:36.847221 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:36.847194 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gbsn_23453ed9-b368-42e8-85da-2827f42c37c5/kube-rbac-proxy/0.log"
Apr 22 18:24:36.872005 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:36.871983 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gbsn_23453ed9-b368-42e8-85da-2827f42c37c5/exporter/0.log"
Apr 22 18:24:36.900423 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:36.900397 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gbsn_23453ed9-b368-42e8-85da-2827f42c37c5/extractor/0.log"
Apr 22 18:24:41.208036 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:41.208005 2551 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-brg25"
Apr 22 18:24:42.836864 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.836836 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:24:42.861109 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.861073 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/egress-router-binary-copy/0.log"
Apr 22 18:24:42.890170 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.890144 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/cni-plugins/0.log"
Apr 22 18:24:42.915101 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.915075 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/bond-cni-plugin/0.log"
Apr 22 18:24:42.940015 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.939993 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/routeoverride-cni/0.log"
Apr 22 18:24:42.963952 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.963914 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/whereabouts-cni-bincopy/0.log"
Apr 22 18:24:42.993171 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:42.993120 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rz6nj_c9321020-7781-4341-88ff-61119a578087/whereabouts-cni/0.log"
Apr 22 18:24:43.230061 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:43.230032 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dskdj_3c510803-84b3-405a-93b2-ef05315ce43a/kube-multus/0.log"
Apr 22 18:24:43.366339 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:43.366316 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vvfq2_f76d2d65-0522-4be1-946b-e6c684435f2d/network-metrics-daemon/0.log"
Apr 22 18:24:43.391735 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:43.391713 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vvfq2_f76d2d65-0522-4be1-946b-e6c684435f2d/kube-rbac-proxy/0.log"
Apr 22 18:24:44.473435 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.473405 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-controller/0.log"
Apr 22 18:24:44.497640 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.497565 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-acl-logging/0.log"
Apr 22 18:24:44.498673 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.498655 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovn-acl-logging/1.log"
Apr 22 18:24:44.521216 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.521199 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/kube-rbac-proxy-node/0.log"
Apr 22 18:24:44.548194 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.548163 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:24:44.576699 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.576658 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/northd/0.log"
Apr 22 18:24:44.601742 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.601716 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/nbdb/0.log"
Apr 22 18:24:44.625045 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.625015 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/sbdb/0.log"
Apr 22 18:24:44.737341 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:44.737305 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz6xk_9ee7d206-a285-4a35-a05b-5ea4d067ba50/ovnkube-controller/0.log"
Apr 22 18:24:46.083462 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:46.083432 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2h4lz_851a7f2e-934f-4bc2-a805-c7e37bdc4757/network-check-target-container/0.log"
Apr 22 18:24:47.178954 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:47.178909 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fkl7j_9e3da756-d433-4424-94e8-89918c5586cd/iptables-alerter/0.log"
Apr 22 18:24:47.982610 ip-10-0-143-88 kubenswrapper[2551]: I0422 18:24:47.982586 2551 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nrnpx_bfcb0720-c589-4598-b75b-3f5d1ec714d3/tuned/0.log"