Apr 21 10:01:10.940478 ip-10-0-140-234 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 10:01:10.940493 ip-10-0-140-234 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 10:01:10.940502 ip-10-0-140-234 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 10:01:10.940849 ip-10-0-140-234 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 10:01:21.187717 ip-10-0-140-234 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 10:01:21.187755 ip-10-0-140-234 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 67e50ff4093b4d0aa96f34bcab39b0bb --
Apr 21 10:03:41.492861 ip-10-0-140-234 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:41.994362 ip-10-0-140-234 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:41.994362 ip-10-0-140-234 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:41.994362 ip-10-0-140-234 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:41.994362 ip-10-0-140-234 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:41.994362 ip-10-0-140-234 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:41.997281 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:41.997176    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 10:03:42.000946 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000931    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000947    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000952    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000955    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000958    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000962    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000965    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000969    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000972    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000977    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000982    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000985    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000992    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.000989 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000995    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.000999    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001002    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001005    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001008    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001011    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001013    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001016    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001018    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001021    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001024    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001027    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001029    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001032    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001035    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001037    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001040    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001043    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001045    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001048    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.001301 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001051    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001053    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001056    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001058    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001061    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001064    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001068    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001070    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001073    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001075    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001078    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001081    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001083    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001085    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001089    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001092    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001095    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001097    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001100    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.001839 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001103    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001106    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001108    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001111    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001115    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001118    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001122    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001124    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001127    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001130    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001133    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001136    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001139    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001141    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001145    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001148    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001150    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001153    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001155    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001158    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.002305 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001160    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001163    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001165    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001168    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001171    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001173    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001175    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001180    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001183    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001185    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001188    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001190    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001193    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.002804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.001196    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002973    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002981    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002984    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002987    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002991    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002994    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002996    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.002999    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003002    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003004    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003007    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003009    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003013    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003016    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003018    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003021    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003023    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003026    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.003125 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003029    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003032    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003034    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003037    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003040    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003042    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003045    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003048    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003051    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003054    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003057    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003060    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003062    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003065    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003067    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003070    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003073    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003076    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003078    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003081    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.003617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003084    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003086    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003089    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003091    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003094    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003097    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003100    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003103    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003106    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003108    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003111    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003114    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003116    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003119    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003121    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003124    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003126    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003129    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003132    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003135    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.004109 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003137    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003140    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003145    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003148    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003152    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003156    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003159    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003164    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003167    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003170    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003172    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003175    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003177    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003180    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003183    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003196    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003199    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003202    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003205    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.004609 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003208    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003212    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003215    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003218    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003221    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003223    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003226    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003228    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003231    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003306    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003314    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003320    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003325    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003329    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003332    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003337    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003342    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003345    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003349    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003352    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003356    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 10:03:42.005094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003359    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003362    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003365    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003368    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003371    2573 flags.go:64] FLAG: --cloud-config=""
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003374    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003377    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003382    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003386    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003390    2573 flags.go:64] FLAG: --config-dir=""
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003393    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003396    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003400    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003404    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003407    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003411    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003414    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003417    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003420    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003423    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003426    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003430    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003433    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003436    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003439    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 10:03:42.005643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003443    2573 flags.go:64] FLAG: --enable-server="true"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003449    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003455    2573 flags.go:64] FLAG: --event-burst="100"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003459    2573 flags.go:64] FLAG: --event-qps="50"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003462    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003465    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003468    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003472    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003475    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003478    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003481    2573 flags.go:64] FLAG: --eviction-soft=""
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003484    2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003487    2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003490    2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003493    2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003496    2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003500    2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003503    2573 flags.go:64] FLAG: --feature-gates=""
Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003507
2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003510 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003513 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003518 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003521 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003524 2573 flags.go:64] FLAG: --help="false" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003527 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.006254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003531 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003550 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003555 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003561 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003564 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003567 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003570 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:42.006868 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003573 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003576 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003579 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003583 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003586 2573 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003589 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003592 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003595 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003598 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003601 2573 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003604 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003607 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003610 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003616 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003619 2573 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003622 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:42.006868 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003625 2573 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003628 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003632 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003635 2573 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003638 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003642 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003646 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003650 2573 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003653 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003656 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003659 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003662 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003665 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:42.007467 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003668 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003671 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003680 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003683 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003687 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003690 2573 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003693 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003700 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003703 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003706 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003709 2573 flags.go:64] FLAG: --port="10250" Apr 21 10:03:42.007467 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003712 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003715 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-043dc4749aba709da" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003718 2573 flags.go:64] FLAG: --qos-reserved="" Apr 
21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003722 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003724 2573 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003727 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003730 2573 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003734 2573 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003738 2573 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003740 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003743 2573 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003747 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003750 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003753 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003756 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003760 2573 flags.go:64] FLAG: --runonce="false" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003763 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003766 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003769 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003773 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003776 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003779 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003782 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003786 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003789 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003792 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:42.008066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003794 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003798 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003801 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003804 2573 flags.go:64] FLAG: --system-cgroups="" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003806 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003812 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:42.008742 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:03:42.003815 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003818 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003823 2573 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003826 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003829 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003832 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003835 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003841 2573 flags.go:64] FLAG: --v="2" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003845 2573 flags.go:64] FLAG: --version="false" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003850 2573 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003854 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.003857 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003956 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003960 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003963 
2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003967 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003970 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:42.008742 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003976 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003979 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003982 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003984 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003987 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003990 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003994 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.003998 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004001 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004003 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004006 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004009 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004012 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004014 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004017 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004019 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004022 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004025 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004027 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004030 2573 feature_gate.go:328] unrecognized 
feature gate: GatewayAPI Apr 21 10:03:42.009289 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004033 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004035 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004042 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004045 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004047 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004050 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004053 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004056 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004059 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004061 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004064 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004067 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004071 
2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004075 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004080 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004083 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004086 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004088 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004091 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:42.009983 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004094 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004096 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004099 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004102 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004104 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004107 2573 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004109 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004112 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004115 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004118 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004120 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004123 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004126 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004129 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004132 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004136 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004139 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004141 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:42.010835 
ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004144 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004147 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:42.010835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004149 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004152 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004154 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004157 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004159 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004163 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004166 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004169 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004172 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004174 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004177 2573 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004179 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004182 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004184 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004187 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004190 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004192 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004195 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004198 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004200 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.011699 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004203 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.004205 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.004211 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.011667 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.011693 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011764 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011772 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011779 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011784 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011789 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011794 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011798 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011803 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011807 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011811 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011815 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.012686 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011820 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011824 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011828 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011833 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011838 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011842 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011847 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011851 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011855 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011859 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011864 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011868 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011872 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011877 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011881 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011886 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011890 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011894 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011898 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011902 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.013213 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011909 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011913 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011918 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011923 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011927 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011931 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011936 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011940 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011944 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011948 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011952 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011959 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011966 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011971 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011975 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011981 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011987 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011992 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.011997 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.013866 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012002 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012006 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012010 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012015 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012019 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012023 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012027 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012032 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012036 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012040 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012045 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012049 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012053 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012059 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012065 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012069 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012074 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012078 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012082 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.014435 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012086 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012091 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012095 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012099 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012103 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012107 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012111 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012141 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012147 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012152 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012157 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012161 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012166 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012170 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012174 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012179 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.015240 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012183 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.012191 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012354 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012363 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012370 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012378 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012383 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012388 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012393 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012397 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012402 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012407 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012412 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012417 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012421 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.015885 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012425 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012429 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012433 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012437 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012442 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012447 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012451 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012455 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012460 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012464 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012468 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012472 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012477 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012481 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012485 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012490 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012494 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012498 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012502 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012507 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.016269 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012511 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012515 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012520 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012524 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012528 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012549 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012554 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012559 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012563 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012568 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012577 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012582 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012586 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012590 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012595 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012599 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012603 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012607 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012611 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012615 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.016918 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012620 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012624 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012629 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012633 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012637 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012641 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012645 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012649 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012653 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012658 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012662 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012667 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012671 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012675 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012680 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012683 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012687 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012691 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012695 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.017416 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012699 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012706 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012711 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012716 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012723 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012728 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012733 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012739 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012744 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012749 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012753 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012758 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012762 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:42.012767 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.012775 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:42.017986 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.013621 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 10:03:42.018373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.016503 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 10:03:42.018373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.017436 2573 server.go:1019] "Starting client certificate rotation"
Apr 21 10:03:42.018373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.017530 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:42.018373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.017585 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:42.046150 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.046118 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:42.050276 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.050243 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:42.063768 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.063744 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 21 10:03:42.072103 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.072081 2573 log.go:25] "Validated CRI v1 image API"
Apr 21 10:03:42.075464 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.075445 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 10:03:42.077773 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.077754 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:42.079695 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.079672 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7bebb665-643a-4908-a09a-223eaf44d59c:/dev/nvme0n1p4 ced09206-c083-478d-9f87-50deeea72d52:/dev/nvme0n1p3]
Apr 21 10:03:42.079779 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.079693 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 10:03:42.085910 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.085805 2573 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:42.083416192 +0000 UTC m=+0.454216892 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099786 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b460cfb4019900dbc05ebee721347 SystemUUID:ec2b460c-fb40-1990-0dbc-05ebee721347 BootID:67e50ff4-093b-4d0a-a96f-34bcab39b0bb Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d6:6a:d9:0a:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d6:6a:d9:0a:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:28:ec:6b:0b:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 10:03:42.085910 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.085905 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 10:03:42.086022 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.085991 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 10:03:42.087089 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.087062 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 10:03:42.087240 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.087090 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-234.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percen
tage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 10:03:42.087288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.087251 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 10:03:42.087288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.087260 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 10:03:42.087288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.087273 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:42.088172 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.088162 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:42.089967 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.089955 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:42.090082 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.090073 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 10:03:42.092208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.092198 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 21 10:03:42.092252 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.092212 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 10:03:42.092252 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.092244 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 10:03:42.092304 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.092254 2573 kubelet.go:397] "Adding 
apiserver pod source" Apr 21 10:03:42.092304 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.092274 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 10:03:42.093316 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.093302 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:42.093364 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.093324 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:42.097388 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.097367 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 10:03:42.098961 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.098948 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 10:03:42.105580 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105555 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105587 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105599 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105608 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105619 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105636 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 
21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105646 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105654 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105665 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105674 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105687 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 10:03:42.105693 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.105700 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 10:03:42.106638 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.106617 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 10:03:42.106721 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.106664 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 10:03:42.108938 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.108901 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 10:03:42.108938 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.108901 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 10:03:42.110666 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.110651 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 10:03:42.110756 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.110750 2573 server.go:1295] "Started kubelet" Apr 21 10:03:42.110908 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.110871 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 10:03:42.111003 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.110885 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 10:03:42.111003 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.110932 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 10:03:42.111682 ip-10-0-140-234 systemd[1]: Started Kubernetes Kubelet. Apr 21 10:03:42.112204 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.112158 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 10:03:42.113702 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.113687 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 21 10:03:42.118920 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.118899 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:42.119433 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.119415 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 10:03:42.120441 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.120420 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 10:03:42.120441 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.120422 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 10:03:42.120624 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.120437 2573 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found" Apr 21 10:03:42.120624 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.120450 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 10:03:42.120624 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.120616 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 21 10:03:42.120748 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.120628 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 21 10:03:42.120748 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.120663 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-234.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 10:03:42.121962 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.120455 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-234.ec2.internal.18a85719fbf60921 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-234.ec2.internal,UID:ip-10-0-140-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-234.ec2.internal,},FirstTimestamp:2026-04-21 10:03:42.110664993 +0000 UTC m=+0.481465701,LastTimestamp:2026-04-21 10:03:42.110664993 +0000 UTC m=+0.481465701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-234.ec2.internal,}" Apr 21 10:03:42.122552 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122514 2573 factory.go:55] 
Registering systemd factory Apr 21 10:03:42.122633 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122556 2573 factory.go:223] Registration of the systemd container factory successfully Apr 21 10:03:42.122825 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122807 2573 factory.go:153] Registering CRI-O factory Apr 21 10:03:42.122825 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122827 2573 factory.go:223] Registration of the crio container factory successfully Apr 21 10:03:42.122983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122878 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 10:03:42.122983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122908 2573 factory.go:103] Registering Raw factory Apr 21 10:03:42.122983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.122923 2573 manager.go:1196] Started watching for new ooms in manager Apr 21 10:03:42.123335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.123318 2573 manager.go:319] Starting recovery of all containers Apr 21 10:03:42.124188 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.124163 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 10:03:42.125005 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.124976 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 10:03:42.125115 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.125087 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gchgt" Apr 21 10:03:42.125187 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.125150 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 10:03:42.132007 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.131985 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gchgt" Apr 21 10:03:42.133279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.133261 2573 manager.go:324] Recovery completed Apr 21 10:03:42.138743 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.138729 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.141516 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.141500 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.141631 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.141525 2573 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.141631 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.141549 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.142058 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.142043 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 10:03:42.142058 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.142056 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 10:03:42.142160 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.142072 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:42.143755 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.143691 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-234.ec2.internal.18a85719fdccbcc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-234.ec2.internal,UID:ip-10-0-140-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-234.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-234.ec2.internal,},FirstTimestamp:2026-04-21 10:03:42.141512903 +0000 UTC m=+0.512313603,LastTimestamp:2026-04-21 10:03:42.141512903 +0000 UTC m=+0.512313603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-234.ec2.internal,}" Apr 21 10:03:42.144420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.144406 2573 policy_none.go:49] "None policy: Start" Apr 21 10:03:42.144478 ip-10-0-140-234 kubenswrapper[2573]: I0421 
10:03:42.144423 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 10:03:42.144478 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.144434 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.185558 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.186643 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.186666 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.186687 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.186695 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.186779 2573 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 21 10:03:42.189836 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.189635 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:42.190206 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190194 2573 manager.go:341] "Starting Device Plugin manager" Apr 21 10:03:42.190254 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.190229 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 10:03:42.190254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190242 2573 server.go:85] "Starting device plugin registration 
server" Apr 21 10:03:42.190528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190516 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 10:03:42.190614 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190530 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 10:03:42.190691 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190666 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 10:03:42.190807 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190761 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 10:03:42.190807 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.190771 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 10:03:42.191409 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.191366 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 10:03:42.191474 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.191412 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-234.ec2.internal\" not found" Apr 21 10:03:42.287019 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.286912 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal"] Apr 21 10:03:42.287019 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.287017 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.288047 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.288030 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.288158 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.288060 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.288158 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.288076 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.289357 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.289344 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.289483 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.289465 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.289557 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.289499 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.290028 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290011 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.290111 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290043 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.290111 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290054 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.290208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290118 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.290208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290144 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.290208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290158 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.290931 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.290917 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.291128 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291108 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.291208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291144 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.291677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291662 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.291742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291676 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.291742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291687 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.291742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291696 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.291742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291698 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.291742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291712 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.291742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.291731 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.299898 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.299882 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.299984 ip-10-0-140-234 
kubenswrapper[2573]: E0421 10:03:42.299906 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-234.ec2.internal\": node \"ip-10-0-140-234.ec2.internal\" not found" Apr 21 10:03:42.310436 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.310418 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found" Apr 21 10:03:42.314117 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.314100 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-234.ec2.internal\" not found" node="ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.318585 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.318568 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-234.ec2.internal\" not found" node="ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.321145 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.321129 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0cbe835788d88cf79d40a6c28376b21d-config\") pod \"kube-apiserver-proxy-ip-10-0-140-234.ec2.internal\" (UID: \"0cbe835788d88cf79d40a6c28376b21d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.321191 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.321154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27525629a2a68e78e34b2a6b2dc5fc66-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal\" (UID: \"27525629a2a68e78e34b2a6b2dc5fc66\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.321191 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.321173 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27525629a2a68e78e34b2a6b2dc5fc66-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal\" (UID: \"27525629a2a68e78e34b2a6b2dc5fc66\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.410988 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.410956 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found" Apr 21 10:03:42.421354 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.421316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0cbe835788d88cf79d40a6c28376b21d-config\") pod \"kube-apiserver-proxy-ip-10-0-140-234.ec2.internal\" (UID: \"0cbe835788d88cf79d40a6c28376b21d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.421354 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.421332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0cbe835788d88cf79d40a6c28376b21d-config\") pod \"kube-apiserver-proxy-ip-10-0-140-234.ec2.internal\" (UID: \"0cbe835788d88cf79d40a6c28376b21d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.421528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.421371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27525629a2a68e78e34b2a6b2dc5fc66-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal\" (UID: \"27525629a2a68e78e34b2a6b2dc5fc66\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" Apr 21 10:03:42.421528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.421388 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27525629a2a68e78e34b2a6b2dc5fc66-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal\" (UID: \"27525629a2a68e78e34b2a6b2dc5fc66\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:42.421528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.421410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27525629a2a68e78e34b2a6b2dc5fc66-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal\" (UID: \"27525629a2a68e78e34b2a6b2dc5fc66\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:42.421528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.421433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27525629a2a68e78e34b2a6b2dc5fc66-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal\" (UID: \"27525629a2a68e78e34b2a6b2dc5fc66\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:42.511467 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.511425 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:42.612198 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.612127 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:42.618335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.618309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:42.621935 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.621919 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:42.712975 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.712775 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:42.813334 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.813299 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:42.913903 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:42.913798 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:42.967694 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:42.967665 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:43.014757 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:43.014721 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:43.017892 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.017873 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 10:03:43.018026 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.018009 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:43.018085 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.018036 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:43.115044 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:43.115012 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:43.119191 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.119166 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:43.135051 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.135017 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:42 +0000 UTC" deadline="2028-01-24 20:38:22.759818938 +0000 UTC"
Apr 21 10:03:43.135051 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.135054 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15442h34m39.624777555s"
Apr 21 10:03:43.143140 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.143117 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:43.148521 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:43.148489 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27525629a2a68e78e34b2a6b2dc5fc66.slice/crio-7a2f9f1c6cb85122a1148f6c21299a3900ec833768b6300c6756fd403b44de70 WatchSource:0}: Error finding container 7a2f9f1c6cb85122a1148f6c21299a3900ec833768b6300c6756fd403b44de70: Status 404 returned error can't find the container with id 7a2f9f1c6cb85122a1148f6c21299a3900ec833768b6300c6756fd403b44de70
Apr 21 10:03:43.150833 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:43.150813 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cbe835788d88cf79d40a6c28376b21d.slice/crio-5cb7ef566f23b77cad39580f0987d9f724227828986bc467952839a8846a5c76 WatchSource:0}: Error finding container 5cb7ef566f23b77cad39580f0987d9f724227828986bc467952839a8846a5c76: Status 404 returned error can't find the container with id 5cb7ef566f23b77cad39580f0987d9f724227828986bc467952839a8846a5c76
Apr 21 10:03:43.154635 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.154618 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:03:43.177953 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.177851 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xrl2j"
Apr 21 10:03:43.184053 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.184034 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xrl2j"
Apr 21 10:03:43.189342 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.189285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" event={"ID":"0cbe835788d88cf79d40a6c28376b21d","Type":"ContainerStarted","Data":"5cb7ef566f23b77cad39580f0987d9f724227828986bc467952839a8846a5c76"}
Apr 21 10:03:43.190236 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.190213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" event={"ID":"27525629a2a68e78e34b2a6b2dc5fc66","Type":"ContainerStarted","Data":"7a2f9f1c6cb85122a1148f6c21299a3900ec833768b6300c6756fd403b44de70"}
Apr 21 10:03:43.215486 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:43.215453 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:43.316028 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:43.315995 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-234.ec2.internal\" not found"
Apr 21 10:03:43.380115 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.380085 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:43.420149 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.420123 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:43.431947 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.431892 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 10:03:43.432788 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.432775 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal"
Apr 21 10:03:43.443547 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.443509 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 10:03:43.646416 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:43.646384 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:44.022611 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.022463 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:44.093153 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.093117 2573 apiserver.go:52] "Watching apiserver"
Apr 21 10:03:44.100993 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.100964 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 10:03:44.101351 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.101329 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-27nvt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j","openshift-cluster-node-tuning-operator/tuned-9j9bx","openshift-image-registry/node-ca-ffsc5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal","openshift-multus/network-metrics-daemon-g9jwm","openshift-network-diagnostics/network-check-target-9n5rr","openshift-network-operator/iptables-alerter-wg6rg","kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal","openshift-dns/node-resolver-594w4","openshift-multus/multus-27bm5","openshift-multus/multus-additional-cni-plugins-hzkgn","openshift-ovn-kubernetes/ovnkube-node-nz95q"]
Apr 21 10:03:44.103804 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.103784 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:44.103907 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.103865 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:03:44.105231 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.105211 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.106978 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.106953 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.107592 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.107572 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qh762\""
Apr 21 10:03:44.107684 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.107612 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.107684 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.107610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 10:03:44.107804 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.107652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.108313 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.108295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ffsc5"
Apr 21 10:03:44.109243 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.109224 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.109335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.109263 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.109397 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.109231 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ghbqc\""
Apr 21 10:03:44.109973 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.109955 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:44.110066 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.110021 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:03:44.110386 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.110360 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 10:03:44.110502 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.110420 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vj7ql\""
Apr 21 10:03:44.110643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.110575 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.110712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.110648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.111364 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.111342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:03:44.112873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.112854 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wg6rg"
Apr 21 10:03:44.113938 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.113763 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 10:03:44.113938 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.113849 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9xwmk\""
Apr 21 10:03:44.113938 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.113858 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 10:03:44.115109 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.115090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-594w4"
Apr 21 10:03:44.115234 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.115215 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.115524 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.115506 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 10:03:44.115640 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.115511 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.115765 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.115748 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8b4v4\""
Apr 21 10:03:44.116585 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.116569 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.117408 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.117387 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.117722 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.117705 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.117929 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.117915 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8q6xb\""
Apr 21 10:03:44.119037 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.118826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.119469 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.119909 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-whmjl\""
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.119935 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.119972 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.120557 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.121306 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 10:03:44.121751 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.121727 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 10:03:44.122167 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.121970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-57929\""
Apr 21 10:03:44.124382 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.123222 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.125777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.125754 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.125870 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.125759 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.126132 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.126112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5wfdc\""
Apr 21 10:03:44.127820 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.127637 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 10:03:44.127820 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.127648 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 10:03:44.127820 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.127758 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 10:03:44.128025 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.127845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 10:03:44.130960 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.130937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysctl-d\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.131056 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.130975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-os-release\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.131056 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0622ef89-a9c2-4672-891f-4e52ebb096b4-hosts-file\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4"
Apr 21 10:03:44.131056 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-kubelet\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.131056 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-hostroot\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.131250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.131250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:44.131250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-cnibin\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.131250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-netns\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.131425 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-cni-netd\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131425 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131301 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovn-node-metrics-cert\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131425 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131319 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597f4\" (UniqueName: \"kubernetes.io/projected/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-kube-api-access-597f4\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.131425 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-iptables-alerter-script\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg"
Apr 21 10:03:44.131425 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l78l9\" (UniqueName: \"kubernetes.io/projected/0622ef89-a9c2-4672-891f-4e52ebb096b4-kube-api-access-l78l9\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-etc-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/736b14db-747c-43b4-bbb6-55ccc6a8a3d8-agent-certs\") pod \"konnectivity-agent-27nvt\" (UID: \"736b14db-747c-43b4-bbb6-55ccc6a8a3d8\") " pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b84j7\" (UniqueName: \"kubernetes.io/projected/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-kube-api-access-b84j7\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-system-cni-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cnibin\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-kubelet\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131605 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-run-netns\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-var-lib-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk65l\" (UniqueName: \"kubernetes.io/projected/a931daa8-594d-442d-b462-5f77532314a5-kube-api-access-gk65l\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-daemon-config\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv4px\" (UniqueName: \"kubernetes.io/projected/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-kube-api-access-jv4px\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-modprobe-d\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-systemd\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.131873 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-var-lib-kubelet\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-etc-kubernetes\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-socket-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.131960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-registration-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysconfig\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-kubernetes\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-systemd-units\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-ovn\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-lib-modules\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.132235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-os-release\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-sys-fs\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132319 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-log-socket\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swr4c\" (UniqueName: \"kubernetes.io/projected/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-kube-api-access-swr4c\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-etc-selinux\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-slash\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132446 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-systemd\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-tmp\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-conf-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132516 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-device-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/b9feaa36-784e-406f-b11b-9f103755a6a0-serviceca\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/736b14db-747c-43b4-bbb6-55ccc6a8a3d8-konnectivity-ca\") pod \"konnectivity-agent-27nvt\" (UID: \"736b14db-747c-43b4-bbb6-55ccc6a8a3d8\") " pod="kube-system/konnectivity-agent-27nvt" Apr 21 10:03:44.132642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132640 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-k8s-cni-cncf-io\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-cni-bin\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-cni-bin\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132727 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-tuned\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-host-slash\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-cni-binary-copy\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxqg4\" (UniqueName: \"kubernetes.io/projected/b9feaa36-784e-406f-b11b-9f103755a6a0-kube-api-access-rxqg4\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysctl-conf\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.133139 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-socket-dir-parent\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-system-cni-dir\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.132983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133002 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-node-log\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-env-overrides\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-run\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.133139 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0622ef89-a9c2-4672-891f-4e52ebb096b4-tmp-dir\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133131 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-multus-certs\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovnkube-script-lib\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4d2p\" (UniqueName: \"kubernetes.io/projected/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-kube-api-access-w4d2p\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133233 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-sys\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133253 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-host\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-cni-multus\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133314 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nf8n\" (UniqueName: \"kubernetes.io/projected/6e9afdbf-0e74-4924-ab98-9859003a83c5-kube-api-access-7nf8n\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovnkube-config\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-cni-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.133832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.133405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9feaa36-784e-406f-b11b-9f103755a6a0-host\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.186110 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.186068 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:43 +0000 UTC" deadline="2027-12-19 05:19:41.550676556 +0000 UTC" Apr 21 10:03:44.186110 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.186104 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14563h15m57.364576576s" Apr 21 10:03:44.221373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.221331 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 10:03:44.234242 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-log-socket\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234423 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swr4c\" (UniqueName: \"kubernetes.io/projected/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-kube-api-access-swr4c\") pod \"multus-27bm5\" (UID: 
\"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234423 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-etc-selinux\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.234423 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-etc-selinux\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.234423 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-log-socket\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-slash\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-systemd\") pod \"ovnkube-node-nz95q\" 
(UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-tmp\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-slash\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-systemd\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-conf-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: 
\"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-conf-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-device-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9feaa36-784e-406f-b11b-9f103755a6a0-serviceca\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234682 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-device-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/736b14db-747c-43b4-bbb6-55ccc6a8a3d8-konnectivity-ca\") pod 
\"konnectivity-agent-27nvt\" (UID: \"736b14db-747c-43b4-bbb6-55ccc6a8a3d8\") " pod="kube-system/konnectivity-agent-27nvt" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-k8s-cni-cncf-io\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-cni-bin\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-cni-bin\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-tuned\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-k8s-cni-cncf-io\") 
pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-host-slash\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-cni-binary-copy\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-cni-bin\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.234918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxqg4\" (UniqueName: \"kubernetes.io/projected/b9feaa36-784e-406f-b11b-9f103755a6a0-kube-api-access-rxqg4\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-cni-bin\") pod \"multus-27bm5\" 
(UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysctl-conf\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.234995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-socket-dir-parent\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-system-cni-dir\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235093 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9feaa36-784e-406f-b11b-9f103755a6a0-serviceca\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.235259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-socket-dir-parent\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.235561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.235561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-system-cni-dir\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.235561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysctl-conf\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.235561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/736b14db-747c-43b4-bbb6-55ccc6a8a3d8-konnectivity-ca\") pod \"konnectivity-agent-27nvt\" (UID: \"736b14db-747c-43b4-bbb6-55ccc6a8a3d8\") " pod="kube-system/konnectivity-agent-27nvt" Apr 21 10:03:44.235777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.235777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-node-log\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.235777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.235777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235688 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-env-overrides\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.235777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-run\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.235777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0622ef89-a9c2-4672-891f-4e52ebb096b4-tmp-dir\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4" Apr 21 10:03:44.236050 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-multus-certs\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.236050 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:44.236050 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.235834 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.236294 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovnkube-script-lib\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.236349 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4d2p\" (UniqueName: \"kubernetes.io/projected/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-kube-api-access-w4d2p\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.236402 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-node-log\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.236464 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-run\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.236510 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236462 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-sys\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.236510 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-sys\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.236620 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-host\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.236620 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-cni-multus\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.236620 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nf8n\" (UniqueName: \"kubernetes.io/projected/6e9afdbf-0e74-4924-ab98-9859003a83c5-kube-api-access-7nf8n\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" Apr 21 10:03:44.236754 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236643 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovnkube-config\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.236754 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-cni-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.236754 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9feaa36-784e-406f-b11b-9f103755a6a0-host\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.236884 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysctl-d\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.236884 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0622ef89-a9c2-4672-891f-4e52ebb096b4-tmp-dir\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4" Apr 21 10:03:44.236971 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-multus-certs\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.236971 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.236912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysctl-d\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.237062 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.237022 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:44.237225 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.237122 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:44.737079073 +0000 UTC m=+3.107879780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:44.237225 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-host-slash\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg" Apr 21 10:03:44.237357 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.237357 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-cni-binary-copy\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.237452 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-host\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.237452 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237419 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-cni-multus\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.237558 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-cni-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.237647 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.237627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9feaa36-784e-406f-b11b-9f103755a6a0-host\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-os-release\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0622ef89-a9c2-4672-891f-4e52ebb096b4-hosts-file\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238188 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-kubelet\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-hostroot\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-os-release\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238431 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-hostroot\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238460 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0622ef89-a9c2-4672-891f-4e52ebb096b4-hosts-file\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4" Apr 21 10:03:44.238957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-cnibin\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.238957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-var-lib-kubelet\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.238957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-cnibin\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.238957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-tuned\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.238957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.238830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.239203 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-netns\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.239203 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-cni-netd\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239203 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovn-node-metrics-cert\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239203 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovnkube-config\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239203 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-597f4\" (UniqueName: \"kubernetes.io/projected/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-kube-api-access-597f4\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.239203 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-tmp\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-cni-netd\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-host-run-netns\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-iptables-alerter-script\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l78l9\" (UniqueName: \"kubernetes.io/projected/0622ef89-a9c2-4672-891f-4e52ebb096b4-kube-api-access-l78l9\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-etc-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/736b14db-747c-43b4-bbb6-55ccc6a8a3d8-agent-certs\") pod \"konnectivity-agent-27nvt\" (UID: \"736b14db-747c-43b4-bbb6-55ccc6a8a3d8\") " pod="kube-system/konnectivity-agent-27nvt" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b84j7\" (UniqueName: \"kubernetes.io/projected/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-kube-api-access-b84j7\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239408 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-system-cni-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5" Apr 21 10:03:44.239457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cnibin\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-kubelet\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-run-netns\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovnkube-script-lib\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239509 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-env-overrides\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-var-lib-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk65l\" (UniqueName: \"kubernetes.io/projected/a931daa8-594d-442d-b462-5f77532314a5-kube-api-access-gk65l\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-cnibin\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn" Apr 21 10:03:44.239883 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:03:44.239690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-daemon-config\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-iptables-alerter-script\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg"
Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv4px\" (UniqueName: \"kubernetes.io/projected/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-kube-api-access-jv4px\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-var-lib-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-etc-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239812 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-openvswitch\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.239883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-system-cni-dir\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239900 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-kubelet\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-modprobe-d\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-systemd\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.239998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-var-lib-kubelet\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-etc-kubernetes\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-socket-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-systemd\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-run-netns\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-registration-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysconfig\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-kubernetes\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-systemd-units\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-multus-daemon-config\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-ovn\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-registration-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.240567 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-lib-modules\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-os-release\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240507 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-lib-modules\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-sys-fs\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-sysconfig\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-kubernetes\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-sys-fs\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-var-lib-kubelet\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-run-ovn\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e9afdbf-0e74-4924-ab98-9859003a83c5-socket-dir\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-etc-kubernetes\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-os-release\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.240982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-etc-modprobe-d\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.241046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-host-run-ovn-kubernetes\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.241315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.241177 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-systemd-units\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.242079 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.241335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.244299 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.244228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/736b14db-747c-43b4-bbb6-55ccc6a8a3d8-agent-certs\") pod \"konnectivity-agent-27nvt\" (UID: \"736b14db-747c-43b4-bbb6-55ccc6a8a3d8\") " pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:03:44.244886 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.244863 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-ovn-node-metrics-cert\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.253938 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.253907 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swr4c\" (UniqueName: \"kubernetes.io/projected/5d0fa637-dd7c-4b7c-b273-afeb822c11b6-kube-api-access-swr4c\") pod \"multus-27bm5\" (UID: \"5d0fa637-dd7c-4b7c-b273-afeb822c11b6\") " pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.258961 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.258930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4d2p\" (UniqueName: \"kubernetes.io/projected/c6a94c5c-b7cd-4e43-9d1e-59ac152bc150-kube-api-access-w4d2p\") pod \"ovnkube-node-nz95q\" (UID: \"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150\") " pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.259103 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.259025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-597f4\" (UniqueName: \"kubernetes.io/projected/0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b-kube-api-access-597f4\") pod \"tuned-9j9bx\" (UID: \"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b\") " pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.259148 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.259131 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:44.259199 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.259150 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:44.259199 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.259161 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:44.259267 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.259228 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. No retries permitted until 2026-04-21 10:03:44.759207553 +0000 UTC m=+3.130008243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:44.262186 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.262100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nf8n\" (UniqueName: \"kubernetes.io/projected/6e9afdbf-0e74-4924-ab98-9859003a83c5-kube-api-access-7nf8n\") pod \"aws-ebs-csi-driver-node-cjc4j\" (UID: \"6e9afdbf-0e74-4924-ab98-9859003a83c5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.262309 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.262213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxqg4\" (UniqueName: \"kubernetes.io/projected/b9feaa36-784e-406f-b11b-9f103755a6a0-kube-api-access-rxqg4\") pod \"node-ca-ffsc5\" (UID: \"b9feaa36-784e-406f-b11b-9f103755a6a0\") " pod="openshift-image-registry/node-ca-ffsc5"
Apr 21 10:03:44.262309 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.262245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk65l\" (UniqueName: \"kubernetes.io/projected/a931daa8-594d-442d-b462-5f77532314a5-kube-api-access-gk65l\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:44.262706 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.262667 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv4px\" (UniqueName: \"kubernetes.io/projected/5ee61994-bf42-4dfb-8334-fb990a0f5d8f-kube-api-access-jv4px\") pod \"multus-additional-cni-plugins-hzkgn\" (UID: \"5ee61994-bf42-4dfb-8334-fb990a0f5d8f\") " pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.263841 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.263823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l78l9\" (UniqueName: \"kubernetes.io/projected/0622ef89-a9c2-4672-891f-4e52ebb096b4-kube-api-access-l78l9\") pod \"node-resolver-594w4\" (UID: \"0622ef89-a9c2-4672-891f-4e52ebb096b4\") " pod="openshift-dns/node-resolver-594w4"
Apr 21 10:03:44.263921 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.263859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b84j7\" (UniqueName: \"kubernetes.io/projected/d93f69ab-ab85-442f-bac7-c3bcf5b11b8e-kube-api-access-b84j7\") pod \"iptables-alerter-wg6rg\" (UID: \"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e\") " pod="openshift-network-operator/iptables-alerter-wg6rg"
Apr 21 10:03:44.417545 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.417440 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j"
Apr 21 10:03:44.427570 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.427527 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9j9bx"
Apr 21 10:03:44.435133 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.435101 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ffsc5"
Apr 21 10:03:44.440776 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.440749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:03:44.447504 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.447480 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wg6rg"
Apr 21 10:03:44.454087 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.454064 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-594w4"
Apr 21 10:03:44.460743 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.460723 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-27bm5"
Apr 21 10:03:44.467415 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.467391 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzkgn"
Apr 21 10:03:44.472055 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.472037 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:03:44.744383 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.744311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:44.744523 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.744460 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:44.744588 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.744526 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:45.744511963 +0000 UTC m=+4.115312653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:44.786408 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.786307 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0622ef89_a9c2_4672_891f_4e52ebb096b4.slice/crio-6cbc7b165ad15a9950a0d97d3cd365bfff5107dcbccd30b0f588287edaa7677b WatchSource:0}: Error finding container 6cbc7b165ad15a9950a0d97d3cd365bfff5107dcbccd30b0f588287edaa7677b: Status 404 returned error can't find the container with id 6cbc7b165ad15a9950a0d97d3cd365bfff5107dcbccd30b0f588287edaa7677b
Apr 21 10:03:44.787953 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.787923 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee61994_bf42_4dfb_8334_fb990a0f5d8f.slice/crio-32044567c5a23bbe0764ff56099d2249a057ccee98dfea3ed72c6d186bf45f4e WatchSource:0}: Error finding container 32044567c5a23bbe0764ff56099d2249a057ccee98dfea3ed72c6d186bf45f4e: Status 404 returned error can't find the container with id 32044567c5a23bbe0764ff56099d2249a057ccee98dfea3ed72c6d186bf45f4e
Apr 21 10:03:44.791688 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.791667 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d0fa637_dd7c_4b7c_b273_afeb822c11b6.slice/crio-1302170ce61498842c21bace28ae79e3be785fa4415d9d57b451a41288055584 WatchSource:0}: Error finding container 1302170ce61498842c21bace28ae79e3be785fa4415d9d57b451a41288055584: Status 404 returned error can't find the container with id 1302170ce61498842c21bace28ae79e3be785fa4415d9d57b451a41288055584
Apr 21 10:03:44.792478 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.792459 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93f69ab_ab85_442f_bac7_c3bcf5b11b8e.slice/crio-73c3e468e3f2e231b22d39713eace0498bce0c712fde047d0ea5a8c28e26e7db WatchSource:0}: Error finding container 73c3e468e3f2e231b22d39713eace0498bce0c712fde047d0ea5a8c28e26e7db: Status 404 returned error can't find the container with id 73c3e468e3f2e231b22d39713eace0498bce0c712fde047d0ea5a8c28e26e7db
Apr 21 10:03:44.793958 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.793940 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod736b14db_747c_43b4_bbb6_55ccc6a8a3d8.slice/crio-7f5157cde6f25d0394a8d3f1b782913a9781f2d680bfc4cfc885761d5ffb5826 WatchSource:0}: Error finding container 7f5157cde6f25d0394a8d3f1b782913a9781f2d680bfc4cfc885761d5ffb5826: Status 404 returned error can't find the container with id 7f5157cde6f25d0394a8d3f1b782913a9781f2d680bfc4cfc885761d5ffb5826
Apr 21 10:03:44.794905 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.794884 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9feaa36_784e_406f_b11b_9f103755a6a0.slice/crio-877ca58ee9cae381db492d84b5acbd53ece031ad6dd0cfcd43c227d6abeb6860 WatchSource:0}: Error finding container 877ca58ee9cae381db492d84b5acbd53ece031ad6dd0cfcd43c227d6abeb6860: Status 404 returned error can't find the container with id 877ca58ee9cae381db492d84b5acbd53ece031ad6dd0cfcd43c227d6abeb6860
Apr 21 10:03:44.796226 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.796058 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e9afdbf_0e74_4924_ab98_9859003a83c5.slice/crio-f82fe9ec9fb5c86f21f602594816063067dec852881409e664f5d9aa0866c466 WatchSource:0}: Error finding container f82fe9ec9fb5c86f21f602594816063067dec852881409e664f5d9aa0866c466: Status 404 returned error can't find the container with id f82fe9ec9fb5c86f21f602594816063067dec852881409e664f5d9aa0866c466
Apr 21 10:03:44.796753 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.796654 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e4294b2_e56e_4ceb_bbc1_4ab1d38dc27b.slice/crio-c745aecf8ed7aa7aed5f8a1e729c6c6664a3717fbc181bb82a0e7d55bc3de2ae WatchSource:0}: Error finding container c745aecf8ed7aa7aed5f8a1e729c6c6664a3717fbc181bb82a0e7d55bc3de2ae: Status 404 returned error can't find the container with id c745aecf8ed7aa7aed5f8a1e729c6c6664a3717fbc181bb82a0e7d55bc3de2ae
Apr 21 10:03:44.797617 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:03:44.797597 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a94c5c_b7cd_4e43_9d1e_59ac152bc150.slice/crio-c4fbffeb42702dfb936e7d473cef104e3da4aa70751c3cb9b3245b46f078e29a WatchSource:0}: Error finding container c4fbffeb42702dfb936e7d473cef104e3da4aa70751c3cb9b3245b46f078e29a: Status 404 returned error can't find the container with id c4fbffeb42702dfb936e7d473cef104e3da4aa70751c3cb9b3245b46f078e29a
Apr 21 10:03:44.844967 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:44.844934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:44.845106 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.845070 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:44.845106 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.845085 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:44.845106 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.845093 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:44.845208 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:44.845136 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. No retries permitted until 2026-04-21 10:03:45.845124468 +0000 UTC m=+4.215925155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:45.186982 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.186752 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:43 +0000 UTC" deadline="2027-09-29 06:35:05.972287062 +0000 UTC"
Apr 21 10:03:45.186982 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.186797 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12620h31m20.785494259s"
Apr 21 10:03:45.186982 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.186905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:45.187636 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.187027 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:03:45.198519 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.198447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"c4fbffeb42702dfb936e7d473cef104e3da4aa70751c3cb9b3245b46f078e29a"}
Apr 21 10:03:45.199704 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.199672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" event={"ID":"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b","Type":"ContainerStarted","Data":"c745aecf8ed7aa7aed5f8a1e729c6c6664a3717fbc181bb82a0e7d55bc3de2ae"}
Apr 21 10:03:45.202427 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.202398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" event={"ID":"6e9afdbf-0e74-4924-ab98-9859003a83c5","Type":"ContainerStarted","Data":"f82fe9ec9fb5c86f21f602594816063067dec852881409e664f5d9aa0866c466"}
Apr 21 10:03:45.204897 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.204868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ffsc5" event={"ID":"b9feaa36-784e-406f-b11b-9f103755a6a0","Type":"ContainerStarted","Data":"877ca58ee9cae381db492d84b5acbd53ece031ad6dd0cfcd43c227d6abeb6860"}
Apr 21 10:03:45.210862 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.210833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-27nvt" event={"ID":"736b14db-747c-43b4-bbb6-55ccc6a8a3d8","Type":"ContainerStarted","Data":"7f5157cde6f25d0394a8d3f1b782913a9781f2d680bfc4cfc885761d5ffb5826"}
Apr 21 10:03:45.224698 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.224648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wg6rg" event={"ID":"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e","Type":"ContainerStarted","Data":"73c3e468e3f2e231b22d39713eace0498bce0c712fde047d0ea5a8c28e26e7db"}
Apr 21 10:03:45.229793 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.229754 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-27bm5" event={"ID":"5d0fa637-dd7c-4b7c-b273-afeb822c11b6","Type":"ContainerStarted","Data":"1302170ce61498842c21bace28ae79e3be785fa4415d9d57b451a41288055584"}
Apr 21 10:03:45.239946 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.239872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerStarted","Data":"32044567c5a23bbe0764ff56099d2249a057ccee98dfea3ed72c6d186bf45f4e"}
Apr 21 10:03:45.242483 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.242381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-594w4" event={"ID":"0622ef89-a9c2-4672-891f-4e52ebb096b4","Type":"ContainerStarted","Data":"6cbc7b165ad15a9950a0d97d3cd365bfff5107dcbccd30b0f588287edaa7677b"}
Apr 21 10:03:45.250878 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.250069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" event={"ID":"0cbe835788d88cf79d40a6c28376b21d","Type":"ContainerStarted","Data":"dc075b394d218e8f58b4cc167f0260d55b62ed8d96ad744bdccbe23489fb4fda"}
Apr 21 10:03:45.265054 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.264634 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-234.ec2.internal" podStartSLOduration=2.264593878 podStartE2EDuration="2.264593878s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:45.26400194 +0000 UTC m=+3.634802653" watchObservedRunningTime="2026-04-21 10:03:45.264593878 +0000 UTC m=+3.635394587"
Apr 21 10:03:45.653107 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.652681 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:45.752572 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.751670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:45.752572 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.751998 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:45.752572 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.752063 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:47.752043836 +0000 UTC m=+6.122844536 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:45.853208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:45.852633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:45.853208 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.852796 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:45.853208 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.852817 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:45.853208 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.852829 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:45.853208 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:45.852886 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:47.852866865 +0000 UTC m=+6.223667566 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:46.189362 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.189329 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:46.189828 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:46.189452 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:03:46.268418 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.268356 2573 generic.go:358] "Generic (PLEG): container finished" podID="27525629a2a68e78e34b2a6b2dc5fc66" containerID="e330388b4591100d02e8f90673c3ba3c99f912be18d89ccf510ca2d1c7bc3d69" exitCode=0 Apr 21 10:03:46.268595 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.268506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" event={"ID":"27525629a2a68e78e34b2a6b2dc5fc66","Type":"ContainerDied","Data":"e330388b4591100d02e8f90673c3ba3c99f912be18d89ccf510ca2d1c7bc3d69"} Apr 21 10:03:46.438438 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.437678 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zjd9r"] Apr 21 10:03:46.441168 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.440697 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.441168 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:46.440773 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:03:46.558458 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.558422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-kubelet-config\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.558642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.558546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-dbus\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.558642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.558578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.658926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-dbus\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.659177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.659219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-kubelet-config\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.659309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-kubelet-config\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:46.659432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-dbus\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:46.659469 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:46.659921 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:46.659555 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret podName:b16969ed-ca80-4fde-b8cb-9e1cbd9d131d nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:47.159513438 +0000 UTC m=+5.530314132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret") pod "global-pull-secret-syncer-zjd9r" (UID: "b16969ed-ca80-4fde-b8cb-9e1cbd9d131d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:47.164391 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:47.164320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:47.164620 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.164557 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:47.164705 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.164692 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret podName:b16969ed-ca80-4fde-b8cb-9e1cbd9d131d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:48.164669984 +0000 UTC m=+6.535470686 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret") pod "global-pull-secret-syncer-zjd9r" (UID: "b16969ed-ca80-4fde-b8cb-9e1cbd9d131d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:47.188252 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:47.187720 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:47.188252 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.187874 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:03:47.278586 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:47.278260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" event={"ID":"27525629a2a68e78e34b2a6b2dc5fc66","Type":"ContainerStarted","Data":"6ace93b1c4b5f150110662d4f2581bb2dadb1d2b3941f2676177c9b67e59a532"} Apr 21 10:03:47.769342 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:47.769304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:47.769580 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.769522 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:47.769663 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.769609 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:51.769589578 +0000 UTC m=+10.140390279 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:47.869767 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:47.869725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:47.869954 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.869934 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:47.869954 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.869955 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:47.870078 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.869970 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:47.870078 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:47.870029 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:51.870011144 +0000 UTC m=+10.240811853 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:48.173169 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:48.172549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:48.173169 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:48.172710 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:48.173169 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:48.172772 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret podName:b16969ed-ca80-4fde-b8cb-9e1cbd9d131d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:50.172754186 +0000 UTC m=+8.543554889 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret") pod "global-pull-secret-syncer-zjd9r" (UID: "b16969ed-ca80-4fde-b8cb-9e1cbd9d131d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:48.188561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:48.187095 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:48.188561 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:48.187244 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:03:48.188561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:48.187645 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:48.188561 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:48.187732 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:03:49.188017 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:49.187738 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:49.188017 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:49.187889 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:03:50.187119 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:50.187090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:50.187320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:50.187091 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:50.187320 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:50.187224 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:03:50.187453 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:50.187330 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:03:50.189288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:50.189259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:50.189726 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:50.189357 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:50.189726 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:50.189410 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret podName:b16969ed-ca80-4fde-b8cb-9e1cbd9d131d nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.189392779 +0000 UTC m=+12.560193483 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret") pod "global-pull-secret-syncer-zjd9r" (UID: "b16969ed-ca80-4fde-b8cb-9e1cbd9d131d") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:51.187950 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:51.187915 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:51.188142 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.188049 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:03:51.802280 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:51.802186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:03:51.802751 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.802377 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:51.802751 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.802459 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:59.80243716 +0000 UTC m=+18.173237862 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:51.903147 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:51.903097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:51.903326 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.903265 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:51.903326 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.903290 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:51.903326 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.903303 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:51.903465 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:51.903373 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:59.90335164 +0000 UTC m=+18.274152350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:52.189121 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:52.188475 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:03:52.189121 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:52.188606 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:03:52.189121 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:52.188659 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:03:52.189121 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:52.188729 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:03:53.187300 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:53.187253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:53.187893 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:53.187406 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:03:54.187843 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:54.187726 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:03:54.188234 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:54.187860 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:03:54.188234 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:54.187909 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:54.188234 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:54.188024 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:03:54.220680 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:54.220642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:03:54.220850 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:54.220817 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 10:03:54.220902 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:54.220889 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret podName:b16969ed-ca80-4fde-b8cb-9e1cbd9d131d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:02.220871098 +0000 UTC m=+20.591671800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret") pod "global-pull-secret-syncer-zjd9r" (UID: "b16969ed-ca80-4fde-b8cb-9e1cbd9d131d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 10:03:55.186955 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:55.186923 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:55.187137 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:55.187055 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:03:56.187455 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:56.187421 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:56.188044 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:56.187556 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:03:56.188044 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:56.187569 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:03:56.188044 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:56.187693 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:03:57.186973 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:57.186934 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:57.187162 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:57.187048 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:03:58.186957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:58.186921 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:58.186957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:58.186960 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:03:58.187569 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:58.187059 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:03:58.187569 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:58.187216 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:03:59.187188 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:59.187143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:59.187610 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.187309 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:03:59.859895 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:59.859846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:03:59.860120 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.860028 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:59.860120 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.860114 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:15.860090521 +0000 UTC m=+34.230891231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:59.960658 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:03:59.960623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:03:59.960856 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.960831 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:59.960856 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.960855 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:59.960962 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.960870 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:59.960962 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:03:59.960932 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. No retries permitted until 2026-04-21 10:04:15.960916606 +0000 UTC m=+34.331717294 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:00.187335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:00.187245 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:00.187761 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:00.187245 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:04:00.187761 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:00.187386 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:04:00.187761 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:00.187470 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:04:01.187705 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:01.187654 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:04:01.188219 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:01.187813 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:04:02.188642 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:02.188608 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:04:02.189306 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:02.188736 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:04:02.189306 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:02.188783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:02.189306 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:02.188898 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:04:02.280445 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:02.280352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:04:02.282120 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:02.280481 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 10:04:02.282120 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:02.280557 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret podName:b16969ed-ca80-4fde-b8cb-9e1cbd9d131d nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.28051491 +0000 UTC m=+36.651315614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret") pod "global-pull-secret-syncer-zjd9r" (UID: "b16969ed-ca80-4fde-b8cb-9e1cbd9d131d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 10:04:02.306996 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:02.306936 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" event={"ID":"0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b","Type":"ContainerStarted","Data":"4f3d873ebb9b61b4180d43c52851849a909f749400ccb804aff1be4f49ba87a2"}
Apr 21 10:04:02.337288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:02.336130 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9j9bx" podStartSLOduration=3.059742488 podStartE2EDuration="20.336111329s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.799557225 +0000 UTC m=+3.170357912" lastFinishedPulling="2026-04-21 10:04:02.075926066 +0000 UTC m=+20.446726753" observedRunningTime="2026-04-21 10:04:02.335761989 +0000 UTC m=+20.706562699" watchObservedRunningTime="2026-04-21 10:04:02.336111329 +0000 UTC m=+20.706912039"
Apr 21 10:04:02.337288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:02.336315 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-234.ec2.internal" podStartSLOduration=19.336309157 podStartE2EDuration="19.336309157s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:47.294859972 +0000 UTC m=+5.665660682" watchObservedRunningTime="2026-04-21 10:04:02.336309157 +0000 UTC m=+20.707109867"
Apr 21 10:04:03.187314 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.187063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:04:03.187457 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:03.187353 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:04:03.311034 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.310997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" event={"ID":"6e9afdbf-0e74-4924-ab98-9859003a83c5","Type":"ContainerStarted","Data":"a4c7ac1a57872a661401a0545ac1ccb636f242117fe6321c8867513c0ec478e7"}
Apr 21 10:04:03.312586 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.312549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ffsc5" event={"ID":"b9feaa36-784e-406f-b11b-9f103755a6a0","Type":"ContainerStarted","Data":"0d1ffa635522a26307da19f453df489017d292aaec81cdfab32d9d89ae54cbc4"}
Apr 21 10:04:03.314056 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.314021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-27nvt" event={"ID":"736b14db-747c-43b4-bbb6-55ccc6a8a3d8","Type":"ContainerStarted","Data":"9574181e52cda0daa630713e2cb17f00356af362520529da16e77513fd0425da"}
Apr 21 10:04:03.315605 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.315578 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-27bm5" event={"ID":"5d0fa637-dd7c-4b7c-b273-afeb822c11b6","Type":"ContainerStarted","Data":"fd130afd5a31d4a17ae7d33b9a54722ea96f56a920a8a5b36882b8ec7ca5d935"}
Apr 21 10:04:03.317094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.317067 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ee61994-bf42-4dfb-8334-fb990a0f5d8f" containerID="99f54c230b78cd13c4a460d666a6673a491ef72cabf4828e9dd053d380ba0527" exitCode=0
Apr 21 10:04:03.317211 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.317149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerDied","Data":"99f54c230b78cd13c4a460d666a6673a491ef72cabf4828e9dd053d380ba0527"}
Apr 21 10:04:03.318800 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.318777 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-594w4" event={"ID":"0622ef89-a9c2-4672-891f-4e52ebb096b4","Type":"ContainerStarted","Data":"c4d4b0fd18142c993d265a117260e16f73dd4f730caa3a28468526f4ba8d3031"}
Apr 21 10:04:03.321826 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.321809 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log"
Apr 21 10:04:03.322187 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322159 2573 generic.go:358] "Generic (PLEG): container finished" podID="c6a94c5c-b7cd-4e43-9d1e-59ac152bc150" containerID="00f6f8df3660d2923c2d9ed7b558fac27030edddb101c3025f0b56a0e5926159" exitCode=1
Apr 21 10:04:03.322290 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322234 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"3351de8cd529d743c6ca7aaf24e84754cfe0e041de8680a790db51f3294d46a5"}
Apr 21 10:04:03.322290 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"a0ce2ddaeeb050cc19ed9ed1a45357d7481eeef7f2c1d14303ae79a149b76096"}
Apr 21 10:04:03.322290 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"cc72aec8d0be8658887d221f696609f79154f6f24fad32ced7f3e7b2239aa582"}
Apr 21 10:04:03.322453 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"8f5cc2303cd0b615ca1cabc0e7839183c2e080b6deca42908bbdf1b28cdd8ed9"}
Apr 21 10:04:03.322453 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerDied","Data":"00f6f8df3660d2923c2d9ed7b558fac27030edddb101c3025f0b56a0e5926159"}
Apr 21 10:04:03.322453 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.322326 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"475c5259f1f7b54705ceff701762b0ff464c93b1970b235f3e60ab108bb29655"}
Apr 21 10:04:03.326193 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.326142 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ffsc5" podStartSLOduration=4.047205507 podStartE2EDuration="21.326125234s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.79698621 +0000 UTC m=+3.167786902" lastFinishedPulling="2026-04-21 10:04:02.075905941 +0000 UTC m=+20.446706629" observedRunningTime="2026-04-21 10:04:03.325256906 +0000 UTC m=+21.696057615" watchObservedRunningTime="2026-04-21 10:04:03.326125234 +0000 UTC m=+21.696925946"
Apr 21 10:04:03.364638 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.364557 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-27bm5" podStartSLOduration=4.034162511 podStartE2EDuration="21.364519689s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.793208265 +0000 UTC m=+3.164008954" lastFinishedPulling="2026-04-21 10:04:02.123565434 +0000 UTC m=+20.494366132" observedRunningTime="2026-04-21 10:04:03.364255001 +0000 UTC m=+21.735055709" watchObservedRunningTime="2026-04-21 10:04:03.364519689 +0000 UTC m=+21.735320427"
Apr 21 10:04:03.377011 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.376967 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-27nvt" podStartSLOduration=4.096850738 podStartE2EDuration="21.376951624s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.795813692 +0000 UTC m=+3.166614395" lastFinishedPulling="2026-04-21 10:04:02.075914592 +0000 UTC m=+20.446715281" observedRunningTime="2026-04-21 10:04:03.376937167 +0000 UTC m=+21.747737889" watchObservedRunningTime="2026-04-21 10:04:03.376951624 +0000 UTC m=+21.747752332"
Apr 21 10:04:03.842901 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:03.842857 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 10:04:04.187877 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.187799 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:04.188090 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:04.187931 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:04:04.188090 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.187989 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:04:04.188207 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:04.188105 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:04:04.198799 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.198689 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:03.842886734Z","UUID":"1a0d65c0-e19b-42b8-a430-051b15a3d1d9","Handler":null,"Name":"","Endpoint":""}
Apr 21 10:04:04.201315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.201292 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 10:04:04.201315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.201322 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 10:04:04.326217 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.326178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" event={"ID":"6e9afdbf-0e74-4924-ab98-9859003a83c5","Type":"ContainerStarted","Data":"51daaf48e6674b28dca4db6c2278417201d2b77d1a57f02201fe924f59ebc9fb"}
Apr 21 10:04:04.327674 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.327625 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wg6rg" event={"ID":"d93f69ab-ab85-442f-bac7-c3bcf5b11b8e","Type":"ContainerStarted","Data":"d46f96794e9b9400a0995dfb0f604260eadd845a36c56e1d0b6b40b83bb685b4"}
Apr 21 10:04:04.341516 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.341457 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wg6rg" podStartSLOduration=5.059673239 podStartE2EDuration="22.341436567s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.794143319 +0000 UTC m=+3.164944006" lastFinishedPulling="2026-04-21 10:04:02.075906644 +0000 UTC m=+20.446707334" observedRunningTime="2026-04-21 10:04:04.341164234 +0000 UTC m=+22.711964942" watchObservedRunningTime="2026-04-21 10:04:04.341436567 +0000 UTC m=+22.712237277"
Apr 21 10:04:04.341688 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:04.341607 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-594w4" podStartSLOduration=5.028972491 podStartE2EDuration="22.341600516s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.788596431 +0000 UTC m=+3.159397131" lastFinishedPulling="2026-04-21 10:04:02.101224457 +0000 UTC m=+20.472025156" observedRunningTime="2026-04-21 10:04:03.390157528 +0000 UTC m=+21.760958238" watchObservedRunningTime="2026-04-21 10:04:04.341600516 +0000 UTC m=+22.712401227"
Apr 21 10:04:05.187943 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:05.187906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:04:05.188143 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:05.188048 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:04:06.186936 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:06.186899 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:04:06.187664 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:06.186899 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:06.187664 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:06.187043 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:04:06.187664 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:06.187107 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:04:06.334926 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:06.334894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log"
Apr 21 10:04:06.335300 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:06.335274 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"77738c5b62cb75c7d826b2761cf14fa11c7646198982b496639443a0e65963ef"}
Apr 21 10:04:06.337133 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:06.337110 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" event={"ID":"6e9afdbf-0e74-4924-ab98-9859003a83c5","Type":"ContainerStarted","Data":"54ab22c13608f3e3158cae3a3e14c780e722650ef0d44e84d82a52fa9c327f71"}
Apr 21 10:04:06.358287 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:06.358233 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cjc4j" podStartSLOduration=3.875928858 podStartE2EDuration="24.358218375s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.799842092 +0000 UTC m=+3.170642779" lastFinishedPulling="2026-04-21 10:04:05.282131602 +0000 UTC m=+23.652932296" observedRunningTime="2026-04-21 10:04:06.357509622 +0000 UTC m=+24.728310342" watchObservedRunningTime="2026-04-21 10:04:06.358218375 +0000 UTC m=+24.729019083"
Apr 21 10:04:07.187459 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:07.187423 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:04:07.188027 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:07.187580 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:04:07.685521 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:07.685326 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:04:07.686009 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:07.685990 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:04:08.187428 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.187393 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:08.187629 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.187396 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r"
Apr 21 10:04:08.187629 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:08.187523 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da"
Apr 21 10:04:08.187629 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:08.187574 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d"
Apr 21 10:04:08.342803 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.342772 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ee61994-bf42-4dfb-8334-fb990a0f5d8f" containerID="88468daf2d1e60ecafedf939afb823e038a68b849d5ef4c012f6487d1614284e" exitCode=0
Apr 21 10:04:08.342965 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.342844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerDied","Data":"88468daf2d1e60ecafedf939afb823e038a68b849d5ef4c012f6487d1614284e"}
Apr 21 10:04:08.348484 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.348468 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log"
Apr 21 10:04:08.348845 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.348821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"594ce5c583123969e500e4f785140e259efdb748bcd474e7dd1a10d853465523"}
Apr 21 10:04:08.349202 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.349160 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:04:08.349378 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.349364 2573 scope.go:117] "RemoveContainer" containerID="00f6f8df3660d2923c2d9ed7b558fac27030edddb101c3025f0b56a0e5926159"
Apr 21 10:04:08.349586 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:08.349572 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-27nvt"
Apr 21 10:04:09.186905 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.186875 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:04:09.187065 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:09.186993 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5"
Apr 21 10:04:09.352186 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.352150 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ee61994-bf42-4dfb-8334-fb990a0f5d8f" containerID="8f0c401c0915971d3a4c24a9961f844d2a7f8189cd0f4118ce1362c001cf55c2" exitCode=0
Apr 21 10:04:09.352641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.352236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerDied","Data":"8f0c401c0915971d3a4c24a9961f844d2a7f8189cd0f4118ce1362c001cf55c2"}
Apr 21 10:04:09.355744 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.355563 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log"
Apr 21 10:04:09.356126 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.356104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" event={"ID":"c6a94c5c-b7cd-4e43-9d1e-59ac152bc150","Type":"ContainerStarted","Data":"604fd51b66cc6e81e28a11cbee99c3aa1fee7e401bec4f12abc98b5af7fbdc1e"}
Apr 21 10:04:09.356335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.356316 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:04:09.356412 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.356347 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:04:09.356412 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.356360 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:04:09.370386 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.370364 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:04:09.370578 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.370566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" Apr 21 10:04:09.401037 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.400987 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q" podStartSLOduration=10.059863982 podStartE2EDuration="27.400974582s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.802191844 +0000 UTC m=+3.172992531" lastFinishedPulling="2026-04-21 10:04:02.143302431 +0000 UTC m=+20.514103131" observedRunningTime="2026-04-21 10:04:09.40048657 +0000 UTC m=+27.771287289" watchObservedRunningTime="2026-04-21 10:04:09.400974582 +0000 UTC m=+27.771775290" Apr 21 10:04:09.419319 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.419292 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zjd9r"] Apr 21 10:04:09.419442 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.419407 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:09.419512 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:09.419495 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:04:09.422555 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.422514 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9n5rr"] Apr 21 10:04:09.422670 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.422660 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:04:09.422783 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:09.422762 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:04:09.423087 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.423058 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g9jwm"] Apr 21 10:04:09.423167 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:09.423156 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:04:09.423280 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:09.423261 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:04:10.360074 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:10.359980 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ee61994-bf42-4dfb-8334-fb990a0f5d8f" containerID="fb20ef42077ec090e2148c5b839af311fc6898c4973451038f330fd79ebc9d72" exitCode=0 Apr 21 10:04:10.360411 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:10.360071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerDied","Data":"fb20ef42077ec090e2148c5b839af311fc6898c4973451038f330fd79ebc9d72"} Apr 21 10:04:11.187508 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:11.187466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:11.187508 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:11.187503 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:04:11.187749 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:11.187513 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:04:11.187749 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:11.187614 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:04:11.187749 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:11.187680 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:04:11.187900 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:11.187761 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:04:13.187647 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:13.187612 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:13.188384 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:13.187612 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:04:13.188384 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:13.187748 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:04:13.188384 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:13.187612 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:04:13.188384 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:13.187831 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:04:13.188384 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:13.187891 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:04:15.187097 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:15.187050 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:04:15.187737 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:15.187131 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:04:15.187737 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:15.187162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:15.187737 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.187285 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:04:15.187737 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.187320 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zjd9r" podUID="b16969ed-ca80-4fde-b8cb-9e1cbd9d131d" Apr 21 10:04:15.187737 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.187422 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9n5rr" podUID="4f2983b7-be09-42ac-b5a7-0c43883354da" Apr 21 10:04:15.886285 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:15.886251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:04:15.886443 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.886420 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:15.886518 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.886506 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:47.886484395 +0000 UTC m=+66.257285096 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:15.987362 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:15.987320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:04:15.987524 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.987489 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:15.987524 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.987512 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:15.987524 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.987522 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ldntl for pod openshift-network-diagnostics/network-check-target-9n5rr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:15.987654 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:15.987587 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl podName:4f2983b7-be09-42ac-b5a7-0c43883354da nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:47.987573151 +0000 UTC m=+66.358373838 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ldntl" (UniqueName: "kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl") pod "network-check-target-9n5rr" (UID: "4f2983b7-be09-42ac-b5a7-0c43883354da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:16.408910 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.408735 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-234.ec2.internal" event="NodeReady" Apr 21 10:04:16.409223 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.409006 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 10:04:16.460757 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.460733 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"] Apr 21 10:04:16.485468 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.485439 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7d85fbdf48-88xdd"] Apr 21 10:04:16.485659 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.485630 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:16.488658 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.488627 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 10:04:16.488843 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.488825 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-llcxp\"" Apr 21 10:04:16.489200 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.489181 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 10:04:16.501832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.501799 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"] Apr 21 10:04:16.501832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.501827 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d85fbdf48-88xdd"] Apr 21 10:04:16.501971 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.501933 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.503463 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.503440 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bqc7r"] Apr 21 10:04:16.506827 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.506807 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 10:04:16.506913 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.506814 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 10:04:16.507292 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.507273 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 10:04:16.508741 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.508726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7p24h\"" Apr 21 10:04:16.521413 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.521389 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 10:04:16.526283 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.526211 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.529247 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.529221 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 10:04:16.529590 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.529559 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bqc7r"] Apr 21 10:04:16.532833 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.532809 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 10:04:16.532937 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.532821 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nqzpr\"" Apr 21 10:04:16.591839 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.591810 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cd55l"] Apr 21 10:04:16.592029 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592085 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:16.592085 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-certificates\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592164 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-bound-sa-token\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592164 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-image-registry-private-configuration\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:16.592250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9bd5820-04e6-410f-bda9-b7b67da26521-ca-trust-extracted\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-trusted-ca\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-installation-pull-secrets\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.592373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.592310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9h9n\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-kube-api-access-d9h9n\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.605802 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.605777 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:16.608689 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.608669 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 10:04:16.608876 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.608852 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 10:04:16.609148 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.609134 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 10:04:16.609346 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.609329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2fp54\"" Apr 21 10:04:16.614911 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.614890 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cd55l"] Apr 21 10:04:16.693683 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693640 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9bd5820-04e6-410f-bda9-b7b67da26521-ca-trust-extracted\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.693683 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.693884 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-trusted-ca\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.693884 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjbh\" (UniqueName: \"kubernetes.io/projected/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-kube-api-access-fhjbh\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.693884 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-installation-pull-secrets\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.693884 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ms49\" (UniqueName: \"kubernetes.io/projected/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-kube-api-access-2ms49\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:16.694003 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693933 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9h9n\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-kube-api-access-d9h9n\") 
pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694003 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.693975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-tmp-dir\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.694088 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694088 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:16.694166 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694082 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9bd5820-04e6-410f-bda9-b7b67da26521-ca-trust-extracted\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694166 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.694126 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 21 10:04:16.694166 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.694143 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found Apr 21 10:04:16.694166 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-config-volume\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.694194 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.194177751 +0000 UTC m=+35.564978439 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-certificates\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-bound-sa-token\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.694293 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694308 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-image-registry-private-configuration\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694343 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.694341 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.194328281 +0000 UTC m=+35.565128983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found Apr 21 10:04:16.694719 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:16.694804 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.694782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-certificates\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.694854 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:04:16.694791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-trusted-ca\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.695033 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.695016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:16.698196 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.698176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-installation-pull-secrets\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.698325 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.698200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-image-registry-private-configuration\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.702923 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.702897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-bound-sa-token\") pod 
\"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.703112 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.703093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9h9n\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-kube-api-access-d9h9n\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:16.794864 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.794834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ms49\" (UniqueName: \"kubernetes.io/projected/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-kube-api-access-2ms49\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:16.795027 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.794891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-tmp-dir\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.795027 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.794919 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:16.795027 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.794933 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-config-volume\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.795027 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.794978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.795027 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.795002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjbh\" (UniqueName: \"kubernetes.io/projected/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-kube-api-access-fhjbh\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.795272 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.795072 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:16.795272 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.795092 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:16.795272 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.795150 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.295126937 +0000 UTC m=+35.665927638 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found Apr 21 10:04:16.795272 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:16.795195 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.295181404 +0000 UTC m=+35.665982100 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found Apr 21 10:04:16.795272 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.795264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-tmp-dir\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.795496 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.795478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-config-volume\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.806374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.806344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhjbh\" (UniqueName: \"kubernetes.io/projected/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-kube-api-access-fhjbh\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " 
pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:16.806469 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:16.806425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ms49\" (UniqueName: \"kubernetes.io/projected/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-kube-api-access-2ms49\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:17.187936 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.187849 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:17.187936 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.187874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:04:17.187936 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.187849 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:04:17.190501 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.190478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:17.190649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.190551 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:17.190649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.190563 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w68bg\"" Apr 21 10:04:17.190649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.190608 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:17.190649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.190609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 10:04:17.190805 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.190654 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfhd5\"" Apr 21 10:04:17.197652 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.197633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:17.197743 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.197683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:17.197807 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.197763 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:04:17.197846 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.197831 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.197816576 +0000 UTC m=+36.568617263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found Apr 21 10:04:17.197846 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.197765 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:04:17.197926 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.197855 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found Apr 21 10:04:17.197926 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.197888 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls 
podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.197877711 +0000 UTC m=+36.568678414 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found Apr 21 10:04:17.301420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.299212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:17.301420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.299350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:17.301420 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.299508 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:17.301420 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.299596 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.299577576 +0000 UTC m=+36.670378276 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found Apr 21 10:04:17.301420 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.299881 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:17.301420 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:17.299947 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:18.299929535 +0000 UTC m=+36.670730228 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found Apr 21 10:04:17.376364 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.376335 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ee61994-bf42-4dfb-8334-fb990a0f5d8f" containerID="d72d3793c39e3cfd5f46d593231b909ff32b0cea8e178661ea3a5208a1d0228b" exitCode=0 Apr 21 10:04:17.376525 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:17.376395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerDied","Data":"d72d3793c39e3cfd5f46d593231b909ff32b0cea8e178661ea3a5208a1d0228b"} Apr 21 10:04:18.205987 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.205949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod 
\"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:04:18.206393 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.206001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:04:18.206393 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.206114 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:04:18.206393 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.206120 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:04:18.206393 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.206130 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found Apr 21 10:04:18.206393 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.206184 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:20.206168039 +0000 UTC m=+38.576968735 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found Apr 21 10:04:18.206393 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.206199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:20.206193044 +0000 UTC m=+38.576993731 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found Apr 21 10:04:18.306781 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.306743 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:04:18.306963 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.306808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:18.306963 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.306847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:04:18.306963 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.306912 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:18.306963 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.306938 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:18.307158 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.306990 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:20.306970734 +0000 UTC m=+38.677771434 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found Apr 21 10:04:18.307158 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:18.307006 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:20.306998494 +0000 UTC m=+38.677799181 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found Apr 21 10:04:18.309425 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.309404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b16969ed-ca80-4fde-b8cb-9e1cbd9d131d-original-pull-secret\") pod \"global-pull-secret-syncer-zjd9r\" (UID: \"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d\") " pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:18.381226 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.381195 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ee61994-bf42-4dfb-8334-fb990a0f5d8f" containerID="4d9b1ec58e12d27197950f916662becf8af972a648193d7092d8f512927847f8" exitCode=0 Apr 21 10:04:18.381407 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.381262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerDied","Data":"4d9b1ec58e12d27197950f916662becf8af972a648193d7092d8f512927847f8"} Apr 21 10:04:18.398091 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.398069 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjd9r" Apr 21 10:04:18.598710 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:18.598670 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zjd9r"] Apr 21 10:04:18.602804 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:04:18.602774 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16969ed_ca80_4fde_b8cb_9e1cbd9d131d.slice/crio-0dae0fcf0c5bd618f048c182d3948c9c8c5a70ef78ac6056abe8c3224ac2fba8 WatchSource:0}: Error finding container 0dae0fcf0c5bd618f048c182d3948c9c8c5a70ef78ac6056abe8c3224ac2fba8: Status 404 returned error can't find the container with id 0dae0fcf0c5bd618f048c182d3948c9c8c5a70ef78ac6056abe8c3224ac2fba8 Apr 21 10:04:19.387208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:19.386918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" event={"ID":"5ee61994-bf42-4dfb-8334-fb990a0f5d8f","Type":"ContainerStarted","Data":"53bbd0b357a514c74d5f42c89c2adca7968df1d3410761474767ceb97db1dff6"} Apr 21 10:04:19.388132 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:19.388100 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zjd9r" event={"ID":"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d","Type":"ContainerStarted","Data":"0dae0fcf0c5bd618f048c182d3948c9c8c5a70ef78ac6056abe8c3224ac2fba8"} Apr 21 10:04:19.416737 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:19.416676 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hzkgn" podStartSLOduration=5.931121652 podStartE2EDuration="37.41665661s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:44.790039035 +0000 UTC m=+3.160839722" lastFinishedPulling="2026-04-21 10:04:16.275573978 +0000 UTC m=+34.646374680" 
observedRunningTime="2026-04-21 10:04:19.414873771 +0000 UTC m=+37.785674481" watchObservedRunningTime="2026-04-21 10:04:19.41665661 +0000 UTC m=+37.787457325"
Apr 21 10:04:20.224690 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:20.224647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:04:20.224883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:20.224706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"
Apr 21 10:04:20.224883 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.224805 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 10:04:20.224883 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.224812 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:20.224883 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.224833 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found
Apr 21 10:04:20.224883 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.224878 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:24.2248581 +0000 UTC m=+42.595658805 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found
Apr 21 10:04:20.225145 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.224892 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:24.224886355 +0000 UTC m=+42.595687042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found
Apr 21 10:04:20.326112 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:20.326066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l"
Apr 21 10:04:20.326300 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:20.326156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") "
pod="openshift-dns/dns-default-bqc7r"
Apr 21 10:04:20.326300 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.326242 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:20.326300 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.326284 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:20.326457 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.326368 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:24.326346484 +0000 UTC m=+42.697147184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:20.326457 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:20.326388 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:24.326378438 +0000 UTC m=+42.697179131 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found
Apr 21 10:04:23.396293 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:23.396257 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zjd9r" event={"ID":"b16969ed-ca80-4fde-b8cb-9e1cbd9d131d","Type":"ContainerStarted","Data":"67a4c0cff8d87790509320fe40480e9117aed7098d834c51179bfbd0d9483483"}
Apr 21 10:04:23.413098 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:23.413048 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zjd9r" podStartSLOduration=33.393114788 podStartE2EDuration="37.413034928s" podCreationTimestamp="2026-04-21 10:03:46 +0000 UTC" firstStartedPulling="2026-04-21 10:04:18.604981008 +0000 UTC m=+36.975781696" lastFinishedPulling="2026-04-21 10:04:22.624901145 +0000 UTC m=+40.995701836" observedRunningTime="2026-04-21 10:04:23.412446952 +0000 UTC m=+41.783247662" watchObservedRunningTime="2026-04-21 10:04:23.413034928 +0000 UTC m=+41.783835637"
Apr 21 10:04:24.259604 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:24.259558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"
Apr 21 10:04:24.259820 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:24.259658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod
\"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:04:24.259820 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.259705 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 10:04:24.259820 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.259743 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:24.259820 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.259754 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found
Apr 21 10:04:24.259820 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.259784 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.259764399 +0000 UTC m=+50.630565087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found
Apr 21 10:04:24.259820 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.259802 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.259794024 +0000 UTC m=+50.630594717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found
Apr 21 10:04:24.359967 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:24.359926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l"
Apr 21 10:04:24.360124 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:24.359998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r"
Apr 21 10:04:24.360124 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.360113 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:24.360188 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.360123 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:24.360188 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.360180 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.360161413 +0000 UTC m=+50.730962102 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:24.360252 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:24.360196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.360188643 +0000 UTC m=+50.730989331 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found
Apr 21 10:04:32.320848 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:32.320807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:04:32.321287 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:32.320857 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"
Apr 21 10:04:32.321287 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.320959 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:32.321287 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.320982 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found
Apr 21 10:04:32.321287 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.321043 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:48.321028003 +0000 UTC m=+66.691828691 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found
Apr 21 10:04:32.321287 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.320965 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 10:04:32.321287 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.321098 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:48.321085543 +0000 UTC m=+66.691886230 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found
Apr 21 10:04:32.421778 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:32.421742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l"
Apr 21 10:04:32.421943 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:32.421818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r"
Apr 21 10:04:32.421943 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.421928 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:32.422026 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.421947 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:32.422026 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.422016 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:48.421999751 +0000 UTC m=+66.792800452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:32.422103 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:32.422031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:48.422022823 +0000 UTC m=+66.792823509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found
Apr 21 10:04:41.372930 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:41.372902 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nz95q"
Apr 21 10:04:47.931076 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:47.931032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:04:47.934287 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:47.934268 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 10:04:47.941796 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:47.941776 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 10:04:47.941879 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:47.941838 2573
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:51.941823341 +0000 UTC m=+130.312624029 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : secret "metrics-daemon-secret" not found
Apr 21 10:04:48.031839 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.031781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:48.034368 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.034346 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 10:04:48.044554 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.044518 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 10:04:48.056569 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.056526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldntl\" (UniqueName: \"kubernetes.io/projected/4f2983b7-be09-42ac-b5a7-0c43883354da-kube-api-access-ldntl\") pod \"network-check-target-9n5rr\" (UID: \"4f2983b7-be09-42ac-b5a7-0c43883354da\") " pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:48.106226 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.106199 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w68bg\""
Apr 21 10:04:48.114172 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.114143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:48.239411 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.239381 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9n5rr"]
Apr 21 10:04:48.242740 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:04:48.242716 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2983b7_be09_42ac_b5a7_0c43883354da.slice/crio-f2f8cde8d5ba43cf2376ea92761d3e15ed784009e4746d86abe84fadd4a7cec0 WatchSource:0}: Error finding container f2f8cde8d5ba43cf2376ea92761d3e15ed784009e4746d86abe84fadd4a7cec0: Status 404 returned error can't find the container with id f2f8cde8d5ba43cf2376ea92761d3e15ed784009e4746d86abe84fadd4a7cec0
Apr 21 10:04:48.334104 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.334066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:04:48.334266 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.334120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") "
pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"
Apr 21 10:04:48.334266 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.334222 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 10:04:48.334266 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.334224 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:48.334266 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.334245 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found
Apr 21 10:04:48.334394 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.334283 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:20.334268256 +0000 UTC m=+98.705068942 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found
Apr 21 10:04:48.334394 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.334298 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:20.334290044 +0000 UTC m=+98.705090730 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found
Apr 21 10:04:48.434700 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.434664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r"
Apr 21 10:04:48.434874 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.434726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l"
Apr 21 10:04:48.434874 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.434839 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:48.434949 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.434907 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:20.434890947 +0000 UTC m=+98.805691643 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found
Apr 21 10:04:48.434949 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.434839 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:48.435033 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:04:48.434996 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:20.43498103 +0000 UTC m=+98.805781724 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:48.445492 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:48.445447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9n5rr" event={"ID":"4f2983b7-be09-42ac-b5a7-0c43883354da","Type":"ContainerStarted","Data":"f2f8cde8d5ba43cf2376ea92761d3e15ed784009e4746d86abe84fadd4a7cec0"}
Apr 21 10:04:51.452949 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:51.452912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9n5rr" event={"ID":"4f2983b7-be09-42ac-b5a7-0c43883354da","Type":"ContainerStarted","Data":"9b63605e8c12e1779bc1d5ed408598b9a0e29bd1c3a7267e471c9323da0a9b91"}
Apr 21 10:04:51.453312 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:51.453031 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9n5rr"
Apr 21 10:04:51.473601 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:04:51.473556 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9n5rr" podStartSLOduration=66.46341586 podStartE2EDuration="1m9.473522381s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:04:48.245095729 +0000 UTC m=+66.615896419" lastFinishedPulling="2026-04-21 10:04:51.25520225 +0000 UTC m=+69.626002940" observedRunningTime="2026-04-21 10:04:51.47304005 +0000 UTC m=+69.843840769" watchObservedRunningTime="2026-04-21 10:04:51.473522381 +0000 UTC m=+69.844323068"
Apr 21 10:05:20.380648 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:05:20.380607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:05:20.381098 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:05:20.380666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"
Apr 21 10:05:20.381098 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.380751 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:05:20.381098 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.380772 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod
openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found
Apr 21 10:05:20.381098 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.380823 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 10:05:20.381098 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.380832 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:24.38081815 +0000 UTC m=+162.751618836 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found
Apr 21 10:05:20.381098 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.380892 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:24.38087852 +0000 UTC m=+162.751679207 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found
Apr 21 10:05:20.481222 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:05:20.481186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l"
Apr 21 10:05:20.481455 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:05:20.481252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r"
Apr 21 10:05:20.481455 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.481339 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:05:20.481455 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.481392 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:24.481378863 +0000 UTC m=+162.852179554 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found Apr 21 10:05:20.481455 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.481338 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:20.481455 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:20.481419 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:24.481413462 +0000 UTC m=+162.852214149 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found Apr 21 10:05:22.458429 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:05:22.458399 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9n5rr" Apr 21 10:05:52.017769 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:05:52.017712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:05:52.018269 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:05:52.017852 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:05:52.018269 ip-10-0-140-234 kubenswrapper[2573]: E0421 
10:05:52.017920 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs podName:a931daa8-594d-442d-b462-5f77532314a5 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:54.017904505 +0000 UTC m=+252.388705193 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs") pod "network-metrics-daemon-g9jwm" (UID: "a931daa8-594d-442d-b462-5f77532314a5") : secret "metrics-daemon-secret" not found Apr 21 10:06:15.525719 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.525680 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"] Apr 21 10:06:15.527844 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.527822 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:15.530445 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.530419 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 10:06:15.530568 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.530423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-29dx2\"" Apr 21 10:06:15.531392 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.531373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:15.531488 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.531396 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 
10:06:15.538338 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.538318 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"] Apr 21 10:06:15.587448 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.587405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:15.587650 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.587516 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hnzg\" (UniqueName: \"kubernetes.io/projected/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-kube-api-access-5hnzg\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:15.688552 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.688480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hnzg\" (UniqueName: \"kubernetes.io/projected/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-kube-api-access-5hnzg\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:15.688735 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.688578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:15.688735 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:15.688677 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:15.688811 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:15.688747 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls podName:857cd775-bb33-4ee5-a1af-0aebdf3b8a00 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.188729014 +0000 UTC m=+154.559529714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f8lvj" (UID: "857cd775-bb33-4ee5-a1af-0aebdf3b8a00") : secret "samples-operator-tls" not found Apr 21 10:06:15.697841 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:15.697808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hnzg\" (UniqueName: \"kubernetes.io/projected/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-kube-api-access-5hnzg\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:16.192135 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:16.192103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:16.192280 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:16.192237 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:16.192340 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:16.192292 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls podName:857cd775-bb33-4ee5-a1af-0aebdf3b8a00 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:17.192276516 +0000 UTC m=+155.563077204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f8lvj" (UID: "857cd775-bb33-4ee5-a1af-0aebdf3b8a00") : secret "samples-operator-tls" not found Apr 21 10:06:17.201121 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:17.201077 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:17.201638 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:17.201251 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:17.201638 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:17.201339 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls podName:857cd775-bb33-4ee5-a1af-0aebdf3b8a00 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:19.201317564 +0000 UTC m=+157.572118252 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f8lvj" (UID: "857cd775-bb33-4ee5-a1af-0aebdf3b8a00") : secret "samples-operator-tls" not found Apr 21 10:06:18.256306 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.256253 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j"] Apr 21 10:06:18.258253 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.258233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.260885 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.260859 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 10:06:18.260993 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.260892 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:18.260993 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.260899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:18.261814 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.261793 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 10:06:18.261872 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.261794 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-42j69\"" Apr 21 10:06:18.269068 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.269040 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j"] Apr 21 10:06:18.309518 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.309478 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/087fe162-4bd2-4285-92a8-117f3a58caa3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.309703 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.309556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/087fe162-4bd2-4285-92a8-117f3a58caa3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.309703 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.309674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7pn7\" (UniqueName: \"kubernetes.io/projected/087fe162-4bd2-4285-92a8-117f3a58caa3-kube-api-access-j7pn7\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.410185 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:06:18.410150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/087fe162-4bd2-4285-92a8-117f3a58caa3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.410318 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.410201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/087fe162-4bd2-4285-92a8-117f3a58caa3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.410449 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.410420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7pn7\" (UniqueName: \"kubernetes.io/projected/087fe162-4bd2-4285-92a8-117f3a58caa3-kube-api-access-j7pn7\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.411427 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.411403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/087fe162-4bd2-4285-92a8-117f3a58caa3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.412616 ip-10-0-140-234 kubenswrapper[2573]: I0421 
10:06:18.412597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/087fe162-4bd2-4285-92a8-117f3a58caa3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.420489 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.420471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7pn7\" (UniqueName: \"kubernetes.io/projected/087fe162-4bd2-4285-92a8-117f3a58caa3-kube-api-access-j7pn7\") pod \"kube-storage-version-migrator-operator-6769c5d45-jwp2j\" (UID: \"087fe162-4bd2-4285-92a8-117f3a58caa3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.567207 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.567172 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" Apr 21 10:06:18.683144 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:18.683110 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j"] Apr 21 10:06:18.686913 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:18.686879 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087fe162_4bd2_4285_92a8_117f3a58caa3.slice/crio-84306f5db1e0c0a78e65be2e6b8aa932976c252428a650b690713c89c2649fc8 WatchSource:0}: Error finding container 84306f5db1e0c0a78e65be2e6b8aa932976c252428a650b690713c89c2649fc8: Status 404 returned error can't find the container with id 84306f5db1e0c0a78e65be2e6b8aa932976c252428a650b690713c89c2649fc8 Apr 21 10:06:19.218155 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:19.218078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" Apr 21 10:06:19.218318 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:19.218199 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:19.218318 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:19.218275 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls podName:857cd775-bb33-4ee5-a1af-0aebdf3b8a00 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:23.218257433 +0000 UTC m=+161.589058126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f8lvj" (UID: "857cd775-bb33-4ee5-a1af-0aebdf3b8a00") : secret "samples-operator-tls" not found Apr 21 10:06:19.500939 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:19.500842 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" podUID="1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0" Apr 21 10:06:19.509991 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:19.509956 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" podUID="e9bd5820-04e6-410f-bda9-b7b67da26521" Apr 21 10:06:19.536310 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:19.536269 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bqc7r" podUID="2a4340c5-5a53-4cd3-b487-d469b4bb82c5" Apr 21 10:06:19.621026 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:19.620992 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bqc7r" Apr 21 10:06:19.621197 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:19.621035 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:06:19.621197 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:19.620990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" event={"ID":"087fe162-4bd2-4285-92a8-117f3a58caa3","Type":"ContainerStarted","Data":"84306f5db1e0c0a78e65be2e6b8aa932976c252428a650b690713c89c2649fc8"} Apr 21 10:06:19.645669 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:19.645633 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cd55l" podUID="dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1" Apr 21 10:06:20.208071 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:20.208014 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-g9jwm" podUID="a931daa8-594d-442d-b462-5f77532314a5" Apr 21 10:06:20.502624 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:20.502548 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-594w4_0622ef89-a9c2-4672-891f-4e52ebb096b4/dns-node-resolver/0.log" Apr 21 10:06:20.623877 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:20.623847 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:06:20.623877 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:20.623857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" event={"ID":"087fe162-4bd2-4285-92a8-117f3a58caa3","Type":"ContainerStarted","Data":"6b4c3aa3af8bb3ba7ad7acd1bbf56d28d521443eb83800a6014f78288a017ca9"} Apr 21 10:06:20.643354 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:20.643297 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" podStartSLOduration=1.016062221 podStartE2EDuration="2.64328159s" podCreationTimestamp="2026-04-21 10:06:18 +0000 UTC" firstStartedPulling="2026-04-21 10:06:18.688561141 +0000 UTC m=+157.059361831" lastFinishedPulling="2026-04-21 10:06:20.315780511 +0000 UTC m=+158.686581200" observedRunningTime="2026-04-21 10:06:20.642600515 +0000 UTC m=+159.013401227" watchObservedRunningTime="2026-04-21 10:06:20.64328159 +0000 UTC m=+159.014082298" Apr 21 10:06:21.587249 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.587220 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zl4kn"] Apr 21 10:06:21.589296 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.589278 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.591717 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.591681 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 10:06:21.591717 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.591710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 10:06:21.592742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.592723 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-ljqww\"" Apr 21 10:06:21.592882 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.592724 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 10:06:21.593226 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.593211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 10:06:21.598319 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.598301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zl4kn"] Apr 21 10:06:21.639480 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.639453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/86b36699-47c8-439f-9a5b-acbb2500525e-signing-cabundle\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.639648 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.639506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/86b36699-47c8-439f-9a5b-acbb2500525e-signing-key\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.639648 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.639527 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k62b\" (UniqueName: \"kubernetes.io/projected/86b36699-47c8-439f-9a5b-acbb2500525e-kube-api-access-8k62b\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.740687 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.740645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/86b36699-47c8-439f-9a5b-acbb2500525e-signing-key\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.740687 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.740694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8k62b\" (UniqueName: \"kubernetes.io/projected/86b36699-47c8-439f-9a5b-acbb2500525e-kube-api-access-8k62b\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.740970 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.740949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/86b36699-47c8-439f-9a5b-acbb2500525e-signing-cabundle\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.741589 ip-10-0-140-234 kubenswrapper[2573]: I0421 
10:06:21.741564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/86b36699-47c8-439f-9a5b-acbb2500525e-signing-cabundle\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.743007 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.742987 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/86b36699-47c8-439f-9a5b-acbb2500525e-signing-key\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.749189 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.749168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k62b\" (UniqueName: \"kubernetes.io/projected/86b36699-47c8-439f-9a5b-acbb2500525e-kube-api-access-8k62b\") pod \"service-ca-865cb79987-zl4kn\" (UID: \"86b36699-47c8-439f-9a5b-acbb2500525e\") " pod="openshift-service-ca/service-ca-865cb79987-zl4kn" Apr 21 10:06:21.898940 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.898854 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zl4kn"
Apr 21 10:06:21.901822 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:21.901799 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ffsc5_b9feaa36-784e-406f-b11b-9f103755a6a0/node-ca/0.log"
Apr 21 10:06:22.014598 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:22.014568 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zl4kn"]
Apr 21 10:06:22.017664 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:22.017613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86b36699_47c8_439f_9a5b_acbb2500525e.slice/crio-761b6e4c9504d8c93631b3b14509919254c1499e21612aa18c73b70b29dfa89d WatchSource:0}: Error finding container 761b6e4c9504d8c93631b3b14509919254c1499e21612aa18c73b70b29dfa89d: Status 404 returned error can't find the container with id 761b6e4c9504d8c93631b3b14509919254c1499e21612aa18c73b70b29dfa89d
Apr 21 10:06:22.628928 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:22.628888 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zl4kn" event={"ID":"86b36699-47c8-439f-9a5b-acbb2500525e","Type":"ContainerStarted","Data":"761b6e4c9504d8c93631b3b14509919254c1499e21612aa18c73b70b29dfa89d"}
Apr 21 10:06:23.252742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:23.252697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"
Apr 21 10:06:23.252936 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:23.252896 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 10:06:23.252990 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:23.252978 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls podName:857cd775-bb33-4ee5-a1af-0aebdf3b8a00 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:31.252954829 +0000 UTC m=+169.623755523 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f8lvj" (UID: "857cd775-bb33-4ee5-a1af-0aebdf3b8a00") : secret "samples-operator-tls" not found
Apr 21 10:06:23.632624 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:23.632583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zl4kn" event={"ID":"86b36699-47c8-439f-9a5b-acbb2500525e","Type":"ContainerStarted","Data":"8cbaef92e8cdbaa12e00ce735ee7dbb238396b188ae2c51926c5f98f5f001143"}
Apr 21 10:06:23.650730 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:23.650683 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zl4kn" podStartSLOduration=1.191001948 podStartE2EDuration="2.650668648s" podCreationTimestamp="2026-04-21 10:06:21 +0000 UTC" firstStartedPulling="2026-04-21 10:06:22.019433436 +0000 UTC m=+160.390234122" lastFinishedPulling="2026-04-21 10:06:23.479100135 +0000 UTC m=+161.849900822" observedRunningTime="2026-04-21 10:06:23.649104346 +0000 UTC m=+162.019905067" watchObservedRunningTime="2026-04-21 10:06:23.650668648 +0000 UTC m=+162.021469356"
Apr 21 10:06:24.463469 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:24.463427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") pod \"image-registry-7d85fbdf48-88xdd\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:06:24.463692 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:24.463498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"
Apr 21 10:06:24.463692 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.463559 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:06:24.463692 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.463577 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d85fbdf48-88xdd: secret "image-registry-tls" not found
Apr 21 10:06:24.463692 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.463601 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 10:06:24.463692 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.463653 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls podName:e9bd5820-04e6-410f-bda9-b7b67da26521 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:26.463632945 +0000 UTC m=+284.834433638 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls") pod "image-registry-7d85fbdf48-88xdd" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521") : secret "image-registry-tls" not found
Apr 21 10:06:24.463692 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.463671 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert podName:1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:26.463661733 +0000 UTC m=+284.834462419 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9gkbz" (UID: "1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0") : secret "networking-console-plugin-cert" not found
Apr 21 10:06:24.564508 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:24.564432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r"
Apr 21 10:06:24.564701 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:24.564512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l"
Apr 21 10:06:24.564701 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.564588 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:06:24.564701 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.564661 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:06:24.564701 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.564666 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls podName:2a4340c5-5a53-4cd3-b487-d469b4bb82c5 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:26.564646185 +0000 UTC m=+284.935446879 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls") pod "dns-default-bqc7r" (UID: "2a4340c5-5a53-4cd3-b487-d469b4bb82c5") : secret "dns-default-metrics-tls" not found
Apr 21 10:06:24.564840 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:24.564718 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert podName:dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:26.564701866 +0000 UTC m=+284.935502563 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert") pod "ingress-canary-cd55l" (UID: "dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1") : secret "canary-serving-cert" not found
Apr 21 10:06:31.319397 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:31.319350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"
Apr 21 10:06:31.321889 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:31.321862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/857cd775-bb33-4ee5-a1af-0aebdf3b8a00-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f8lvj\" (UID: \"857cd775-bb33-4ee5-a1af-0aebdf3b8a00\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"
Apr 21 10:06:31.436505 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:31.436461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"
Apr 21 10:06:31.563515 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:31.563476 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj"]
Apr 21 10:06:31.650544 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:31.650496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" event={"ID":"857cd775-bb33-4ee5-a1af-0aebdf3b8a00","Type":"ContainerStarted","Data":"8f1f8aad83a743499ccb96e68e850d601f9ec6511a98a85d17bce993b32bc045"}
Apr 21 10:06:33.187087 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:33.187055 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd"
Apr 21 10:06:33.656747 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:33.656708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" event={"ID":"857cd775-bb33-4ee5-a1af-0aebdf3b8a00","Type":"ContainerStarted","Data":"6facc0e49e0eb3cfffa87ba6623f0e0b8719fad0f34a02cbcd15d70e602fa7c1"}
Apr 21 10:06:33.656747 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:33.656745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" event={"ID":"857cd775-bb33-4ee5-a1af-0aebdf3b8a00","Type":"ContainerStarted","Data":"2304c81cae6e6a589ccea02cbf49b2cea27208cd237d30d5b26531fa9e0ed569"}
Apr 21 10:06:33.674473 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:33.674415 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f8lvj" podStartSLOduration=17.240081604 podStartE2EDuration="18.674398834s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="2026-04-21 10:06:31.602982373 +0000 UTC m=+169.973783060" lastFinishedPulling="2026-04-21 10:06:33.037299592 +0000 UTC m=+171.408100290" observedRunningTime="2026-04-21 10:06:33.673486428 +0000 UTC m=+172.044287137" watchObservedRunningTime="2026-04-21 10:06:33.674398834 +0000 UTC m=+172.045199543"
Apr 21 10:06:35.187080 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:35.187044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm"
Apr 21 10:06:44.420084 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.420049 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8npvl"]
Apr 21 10:06:44.425457 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.425438 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.427914 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.427879 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 10:06:44.428893 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.428871 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 10:06:44.428995 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.428871 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 10:06:44.428995 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.428936 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fdjpw\""
Apr 21 10:06:44.429110 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.429007 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 10:06:44.433555 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.433518 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8npvl"]
Apr 21 10:06:44.524157 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.524122 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.524353 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.524188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.524353 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.524255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-crio-socket\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.524353 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.524292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cwt\" (UniqueName: \"kubernetes.io/projected/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-kube-api-access-s5cwt\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.524493 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.524394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-data-volume\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625326 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625326 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-crio-socket\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625626 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cwt\" (UniqueName: \"kubernetes.io/projected/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-kube-api-access-s5cwt\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625626 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-data-volume\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625626 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625626 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-crio-socket\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.625901 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.625881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-data-volume\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.626073 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.626058 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.627811 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.627788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.633510 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.633483 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cwt\" (UniqueName: \"kubernetes.io/projected/1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e-kube-api-access-s5cwt\") pod \"insights-runtime-extractor-8npvl\" (UID: \"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e\") " pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.735777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.735691 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8npvl"
Apr 21 10:06:44.864131 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:44.864098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8npvl"]
Apr 21 10:06:44.867069 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:44.867042 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c1cffa6_d9b7_4ec6_8295_0fe0de45a40e.slice/crio-97571177a2e55ae785ad14caa84000966cbe013ac3602aa224c3b27cbf4233b1 WatchSource:0}: Error finding container 97571177a2e55ae785ad14caa84000966cbe013ac3602aa224c3b27cbf4233b1: Status 404 returned error can't find the container with id 97571177a2e55ae785ad14caa84000966cbe013ac3602aa224c3b27cbf4233b1
Apr 21 10:06:45.687967 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:45.687932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8npvl" event={"ID":"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e","Type":"ContainerStarted","Data":"a08296079af4fadde58eda49a7501b647a5a7cef4e85990ec49dca830d12044d"}
Apr 21 10:06:45.687967 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:45.687968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8npvl" event={"ID":"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e","Type":"ContainerStarted","Data":"e32a421dc5b770f1685f747b1847b1cae9206d72ffc651fbb99dd004d824ef67"}
Apr 21 10:06:45.688350 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:45.687977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8npvl" event={"ID":"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e","Type":"ContainerStarted","Data":"97571177a2e55ae785ad14caa84000966cbe013ac3602aa224c3b27cbf4233b1"}
Apr 21 10:06:47.695108 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:47.695074 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8npvl" event={"ID":"1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e","Type":"ContainerStarted","Data":"96919710bc1b41e3a8adc5466a7ac3e49a8218909f65dfcbdea0aadc282b43b5"}
Apr 21 10:06:47.711493 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:47.711449 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8npvl" podStartSLOduration=1.651299609 podStartE2EDuration="3.711434125s" podCreationTimestamp="2026-04-21 10:06:44 +0000 UTC" firstStartedPulling="2026-04-21 10:06:44.925606562 +0000 UTC m=+183.296407249" lastFinishedPulling="2026-04-21 10:06:46.985741074 +0000 UTC m=+185.356541765" observedRunningTime="2026-04-21 10:06:47.711169466 +0000 UTC m=+186.081970175" watchObservedRunningTime="2026-04-21 10:06:47.711434125 +0000 UTC m=+186.082234833"
Apr 21 10:06:49.536243 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.536205 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"]
Apr 21 10:06:49.539129 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.539113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:49.541886 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.541860 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 21 10:06:49.541886 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.541877 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2rnqz\""
Apr 21 10:06:49.547572 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.547525 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"]
Apr 21 10:06:49.667194 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.667156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/821ef817-ee5c-486f-9814-da4d97b80753-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lhrt8\" (UID: \"821ef817-ee5c-486f-9814-da4d97b80753\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:49.768441 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:49.768397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/821ef817-ee5c-486f-9814-da4d97b80753-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lhrt8\" (UID: \"821ef817-ee5c-486f-9814-da4d97b80753\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:49.768712 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:49.768583 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 10:06:49.768712 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:06:49.768663 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821ef817-ee5c-486f-9814-da4d97b80753-tls-certificates podName:821ef817-ee5c-486f-9814-da4d97b80753 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:50.268643355 +0000 UTC m=+188.639444068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/821ef817-ee5c-486f-9814-da4d97b80753-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-lhrt8" (UID: "821ef817-ee5c-486f-9814-da4d97b80753") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 10:06:50.272816 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:50.272780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/821ef817-ee5c-486f-9814-da4d97b80753-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lhrt8\" (UID: \"821ef817-ee5c-486f-9814-da4d97b80753\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:50.275199 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:50.275178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/821ef817-ee5c-486f-9814-da4d97b80753-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lhrt8\" (UID: \"821ef817-ee5c-486f-9814-da4d97b80753\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:50.449927 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:50.449893 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:50.575340 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:50.575308 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"]
Apr 21 10:06:50.589420 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:50.589388 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821ef817_ee5c_486f_9814_da4d97b80753.slice/crio-785a534f385a08c939d43a7b3a43aba99c121f2d08568210b20218fa98c6e7ff WatchSource:0}: Error finding container 785a534f385a08c939d43a7b3a43aba99c121f2d08568210b20218fa98c6e7ff: Status 404 returned error can't find the container with id 785a534f385a08c939d43a7b3a43aba99c121f2d08568210b20218fa98c6e7ff
Apr 21 10:06:50.703292 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:50.703250 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8" event={"ID":"821ef817-ee5c-486f-9814-da4d97b80753","Type":"ContainerStarted","Data":"785a534f385a08c939d43a7b3a43aba99c121f2d08568210b20218fa98c6e7ff"}
Apr 21 10:06:51.707304 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:51.707256 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8" event={"ID":"821ef817-ee5c-486f-9814-da4d97b80753","Type":"ContainerStarted","Data":"d6700ca98814f3b9830f6f3a5e8dd508f3bd3b745ed43f02ef55cbfc5adfdbdf"}
Apr 21 10:06:51.707777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:51.707461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:51.712811 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:51.712788 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8"
Apr 21 10:06:51.722843 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:51.722799 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lhrt8" podStartSLOduration=1.74097497 podStartE2EDuration="2.722788859s" podCreationTimestamp="2026-04-21 10:06:49 +0000 UTC" firstStartedPulling="2026-04-21 10:06:50.591767699 +0000 UTC m=+188.962568387" lastFinishedPulling="2026-04-21 10:06:51.573581582 +0000 UTC m=+189.944382276" observedRunningTime="2026-04-21 10:06:51.721025543 +0000 UTC m=+190.091826253" watchObservedRunningTime="2026-04-21 10:06:51.722788859 +0000 UTC m=+190.093589564"
Apr 21 10:06:57.050022 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.049981 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wd84q"]
Apr 21 10:06:57.053617 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.053594 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v6kj4"]
Apr 21 10:06:57.053767 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.053746 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.056388 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.056360 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 10:06:57.056857 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.056841 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4"
Apr 21 10:06:57.057753 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.057528 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 10:06:57.057753 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.057545 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 10:06:57.057753 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.057595 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 10:06:57.057753 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.057571 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 10:06:57.058001 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.057818 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5rh4g\""
Apr 21 10:06:57.058001 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.057910 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 10:06:57.059076 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.059058 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-m5ncz\""
Apr 21 10:06:57.059488 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.059469 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 21 10:06:57.059604 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.059511 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 21 10:06:57.059772 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.059757 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 21 10:06:57.069353 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.069326 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v6kj4"]
Apr 21 10:06:57.227108 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4"
Apr 21 10:06:57.227307 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-accelerators-collector-config\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.227307 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-textfile\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.227307 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5192e80-1d1c-45a0-9dea-817499443dd0-metrics-client-ca\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.227307 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhntn\" (UniqueName: \"kubernetes.io/projected/d5192e80-1d1c-45a0-9dea-817499443dd0-kube-api-access-nhntn\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.227471 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-sys\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.227471 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-tls\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q"
Apr 21 10:06:57.227471 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tcs8\" (UniqueName: \"kubernetes.io/projected/fb46e263-367f-4628-8a44-6e443f2c276d-kube-api-access-2tcs8\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\")
" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.227471 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.227471 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.227679 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227495 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-root\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.227679 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.227679 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:06:57.227551 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb46e263-367f-4628-8a44-6e443f2c276d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.227679 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-wtmp\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.227679 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.227647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fb46e263-367f-4628-8a44-6e443f2c276d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.328320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fb46e263-367f-4628-8a44-6e443f2c276d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.328320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.328320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-accelerators-collector-config\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-textfile\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5192e80-1d1c-45a0-9dea-817499443dd0-metrics-client-ca\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhntn\" (UniqueName: \"kubernetes.io/projected/d5192e80-1d1c-45a0-9dea-817499443dd0-kube-api-access-nhntn\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328677 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:06:57.328409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-sys\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-sys\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-tls\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tcs8\" (UniqueName: \"kubernetes.io/projected/fb46e263-367f-4628-8a44-6e443f2c276d-kube-api-access-2tcs8\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.328677 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-root\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb46e263-367f-4628-8a44-6e443f2c276d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fb46e263-367f-4628-8a44-6e443f2c276d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-textfile\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-wtmp\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-wtmp\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329096 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.328976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-accelerators-collector-config\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329096 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:06:57.329052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d5192e80-1d1c-45a0-9dea-817499443dd0-root\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329469 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.329130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5192e80-1d1c-45a0-9dea-817499443dd0-metrics-client-ca\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.329469 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.329327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.329597 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.329568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb46e263-367f-4628-8a44-6e443f2c276d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.331290 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.331251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-tls\") pod \"node-exporter-wd84q\" (UID: 
\"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.331461 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.331442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.331643 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.331623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5192e80-1d1c-45a0-9dea-817499443dd0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.331755 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.331737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb46e263-367f-4628-8a44-6e443f2c276d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.336432 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.336410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhntn\" (UniqueName: \"kubernetes.io/projected/d5192e80-1d1c-45a0-9dea-817499443dd0-kube-api-access-nhntn\") pod \"node-exporter-wd84q\" (UID: \"d5192e80-1d1c-45a0-9dea-817499443dd0\") " pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.336557 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.336411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2tcs8\" (UniqueName: \"kubernetes.io/projected/fb46e263-367f-4628-8a44-6e443f2c276d-kube-api-access-2tcs8\") pod \"kube-state-metrics-69db897b98-v6kj4\" (UID: \"fb46e263-367f-4628-8a44-6e443f2c276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.366595 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.366553 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wd84q" Apr 21 10:06:57.372376 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.372346 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" Apr 21 10:06:57.374717 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:57.374689 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5192e80_1d1c_45a0_9dea_817499443dd0.slice/crio-e7f8c68c8f9c42f152771d8753235701e5a7dbcaa8d59657dc5e6812a5c0f283 WatchSource:0}: Error finding container e7f8c68c8f9c42f152771d8753235701e5a7dbcaa8d59657dc5e6812a5c0f283: Status 404 returned error can't find the container with id e7f8c68c8f9c42f152771d8753235701e5a7dbcaa8d59657dc5e6812a5c0f283 Apr 21 10:06:57.499284 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.499238 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v6kj4"] Apr 21 10:06:57.502186 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:57.502159 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb46e263_367f_4628_8a44_6e443f2c276d.slice/crio-d9d2c0350ae08b99649f4026a224df3dad282220fc546abc3c7724c5118fcd9d WatchSource:0}: Error finding container d9d2c0350ae08b99649f4026a224df3dad282220fc546abc3c7724c5118fcd9d: Status 404 returned error can't find the container with id 
d9d2c0350ae08b99649f4026a224df3dad282220fc546abc3c7724c5118fcd9d Apr 21 10:06:57.724197 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.724097 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" event={"ID":"fb46e263-367f-4628-8a44-6e443f2c276d","Type":"ContainerStarted","Data":"d9d2c0350ae08b99649f4026a224df3dad282220fc546abc3c7724c5118fcd9d"} Apr 21 10:06:57.725116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:57.725097 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wd84q" event={"ID":"d5192e80-1d1c-45a0-9dea-817499443dd0","Type":"ContainerStarted","Data":"e7f8c68c8f9c42f152771d8753235701e5a7dbcaa8d59657dc5e6812a5c0f283"} Apr 21 10:06:58.077795 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.077758 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:06:58.082021 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.081998 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.085109 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.085081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 10:06:58.085497 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.085478 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 10:06:58.086240 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.086216 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 10:06:58.086337 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.086307 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 10:06:58.086393 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.086330 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 10:06:58.086688 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.086671 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 10:06:58.087078 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.086725 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 10:06:58.087390 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.087371 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 10:06:58.088487 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.088221 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6v8sh\"" Apr 21 10:06:58.097733 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.097574 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 10:06:58.099856 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.099832 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:06:58.237915 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.237890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238047 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.237938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238047 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.237980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238047 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238021 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238215 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238215 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-out\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238215 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-web-config\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238215 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238215 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvlm7\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-kube-api-access-nvlm7\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.238462 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.238330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-volume\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339294 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339294 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 
10:06:58.339353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-out\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-web-config\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339517 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339934 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339934 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvlm7\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-kube-api-access-nvlm7\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.339934 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.339657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-volume\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.341102 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.340220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.341102 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.340596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.341102 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.341059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.342891 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.342865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.343131 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.343106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-out\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.343745 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.343408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-volume\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.343745 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.343573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.343902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.343845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.344232 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.344208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.344232 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.344222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.344587 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.344568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.344822 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.344803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-web-config\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.349264 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.349242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvlm7\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-kube-api-access-nvlm7\") pod \"alertmanager-main-0\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.398393 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.398355 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:58.637027 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.636994 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:06:58.640745 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:06:58.640702 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf7f0eb5_4d61_4bc5_8eb8_7be3eec9c780.slice/crio-482eb58361898789dc14397c64bd3af7e4df56e511d6c6c55812810229d42bfa WatchSource:0}: Error finding container 482eb58361898789dc14397c64bd3af7e4df56e511d6c6c55812810229d42bfa: Status 404 returned error can't find the container with id 482eb58361898789dc14397c64bd3af7e4df56e511d6c6c55812810229d42bfa Apr 21 10:06:58.730236 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.730200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" event={"ID":"fb46e263-367f-4628-8a44-6e443f2c276d","Type":"ContainerStarted","Data":"0d35e12fd8fbc748e0d5b92e4624d5cd313789e84eeb6caa09b316b44116d0eb"} Apr 21 10:06:58.730406 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.730245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" event={"ID":"fb46e263-367f-4628-8a44-6e443f2c276d","Type":"ContainerStarted","Data":"92de38c769f51cd9f6dc56eee1cf0d8baece9c1404dba076dad53f2c8978ccb3"} Apr 21 10:06:58.731397 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.731367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"482eb58361898789dc14397c64bd3af7e4df56e511d6c6c55812810229d42bfa"} Apr 21 10:06:58.733001 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.732920 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="d5192e80-1d1c-45a0-9dea-817499443dd0" containerID="01587a7e38fcc8b72137261a20233d33559c22cae5e8a1237effffb0f1f55212" exitCode=0 Apr 21 10:06:58.733001 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:58.732981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wd84q" event={"ID":"d5192e80-1d1c-45a0-9dea-817499443dd0","Type":"ContainerDied","Data":"01587a7e38fcc8b72137261a20233d33559c22cae5e8a1237effffb0f1f55212"} Apr 21 10:06:59.737145 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:59.736942 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" event={"ID":"fb46e263-367f-4628-8a44-6e443f2c276d","Type":"ContainerStarted","Data":"79ba25282cf1bb60a1949fc61abdcd1184a9f703173c4f03d6c2e871ff18fe0a"} Apr 21 10:06:59.739007 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:59.738984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wd84q" event={"ID":"d5192e80-1d1c-45a0-9dea-817499443dd0","Type":"ContainerStarted","Data":"05b1537d11e802e9c47925b31ed4b0bce7e81f2ae0a553a8646aab36981d8bef"} Apr 21 10:06:59.739084 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:59.739011 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wd84q" event={"ID":"d5192e80-1d1c-45a0-9dea-817499443dd0","Type":"ContainerStarted","Data":"2d088134e03b06db29d601f20458a3ad20d3efd1247dbf062955a864fbf4c570"} Apr 21 10:06:59.781080 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:59.780925 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wd84q" podStartSLOduration=1.982453661 podStartE2EDuration="2.780905047s" podCreationTimestamp="2026-04-21 10:06:57 +0000 UTC" firstStartedPulling="2026-04-21 10:06:57.377491583 +0000 UTC m=+195.748292269" lastFinishedPulling="2026-04-21 10:06:58.175934822 +0000 UTC m=+196.546743655" 
observedRunningTime="2026-04-21 10:06:59.780383155 +0000 UTC m=+198.151183875" watchObservedRunningTime="2026-04-21 10:06:59.780905047 +0000 UTC m=+198.151705759" Apr 21 10:06:59.781742 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:06:59.781709 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-v6kj4" podStartSLOduration=1.732425953 podStartE2EDuration="2.781698002s" podCreationTimestamp="2026-04-21 10:06:57 +0000 UTC" firstStartedPulling="2026-04-21 10:06:57.504077056 +0000 UTC m=+195.874877742" lastFinishedPulling="2026-04-21 10:06:58.553349103 +0000 UTC m=+196.924149791" observedRunningTime="2026-04-21 10:06:59.760423025 +0000 UTC m=+198.131223771" watchObservedRunningTime="2026-04-21 10:06:59.781698002 +0000 UTC m=+198.152498712" Apr 21 10:07:00.743238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:00.743205 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89" exitCode=0 Apr 21 10:07:00.743624 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:00.743307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89"} Apr 21 10:07:01.397769 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.397732 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-fb55fcdd5-67vqq"] Apr 21 10:07:01.401038 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.401015 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.404129 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.404110 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 10:07:01.405091 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.405069 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-74ktm\"" Apr 21 10:07:01.405223 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.405072 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 10:07:01.405441 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.405330 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-d4r7jigeu0phq\"" Apr 21 10:07:01.405601 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.405444 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 10:07:01.405601 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.405471 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 10:07:01.417620 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.417593 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fb55fcdd5-67vqq"] Apr 21 10:07:01.572307 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-secret-metrics-server-tls\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " 
pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.572307 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq2xs\" (UniqueName: \"kubernetes.io/projected/049e9c77-bac7-43db-b890-f3604cc9398b-kube-api-access-gq2xs\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.572586 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-client-ca-bundle\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.572586 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/049e9c77-bac7-43db-b890-f3604cc9398b-metrics-server-audit-profiles\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.572586 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/049e9c77-bac7-43db-b890-f3604cc9398b-audit-log\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.572738 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572607 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-secret-metrics-server-client-certs\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.572738 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.572635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049e9c77-bac7-43db-b890-f3604cc9398b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.673959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-client-ca-bundle\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.674067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/049e9c77-bac7-43db-b890-f3604cc9398b-metrics-server-audit-profiles\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.674149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/049e9c77-bac7-43db-b890-f3604cc9398b-audit-log\") pod 
\"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.674191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-secret-metrics-server-client-certs\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.674219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049e9c77-bac7-43db-b890-f3604cc9398b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.674250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-secret-metrics-server-tls\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.674277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq2xs\" (UniqueName: \"kubernetes.io/projected/049e9c77-bac7-43db-b890-f3604cc9398b-kube-api-access-gq2xs\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.675712 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:07:01.675337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049e9c77-bac7-43db-b890-f3604cc9398b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.677569 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.677230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/049e9c77-bac7-43db-b890-f3604cc9398b-metrics-server-audit-profiles\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.677569 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.677511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/049e9c77-bac7-43db-b890-f3604cc9398b-audit-log\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.680488 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.680300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-secret-metrics-server-tls\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.681675 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.681629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-secret-metrics-server-client-certs\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.682257 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.682233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049e9c77-bac7-43db-b890-f3604cc9398b-client-ca-bundle\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.684213 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.684178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq2xs\" (UniqueName: \"kubernetes.io/projected/049e9c77-bac7-43db-b890-f3604cc9398b-kube-api-access-gq2xs\") pod \"metrics-server-fb55fcdd5-67vqq\" (UID: \"049e9c77-bac7-43db-b890-f3604cc9398b\") " pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.712615 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.712583 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:01.856093 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:01.856050 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fb55fcdd5-67vqq"] Apr 21 10:07:01.859451 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:07:01.859421 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049e9c77_bac7_43db_b890_f3604cc9398b.slice/crio-fd4aad16ff8188dda0da553c162e73413a8a61b18eea205185caf781e8f1478a WatchSource:0}: Error finding container fd4aad16ff8188dda0da553c162e73413a8a61b18eea205185caf781e8f1478a: Status 404 returned error can't find the container with id fd4aad16ff8188dda0da553c162e73413a8a61b18eea205185caf781e8f1478a Apr 21 10:07:02.753370 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:02.753265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8"} Apr 21 10:07:02.753370 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:02.753307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91"} Apr 21 10:07:02.753370 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:02.753323 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f"} Apr 21 10:07:02.753370 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:02.753336 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a"} Apr 21 10:07:02.755350 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:02.755249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" event={"ID":"049e9c77-bac7-43db-b890-f3604cc9398b","Type":"ContainerStarted","Data":"fd4aad16ff8188dda0da553c162e73413a8a61b18eea205185caf781e8f1478a"} Apr 21 10:07:03.229870 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.229834 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:07:03.234422 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.234397 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.237655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.237629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 10:07:03.239485 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.239463 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 10:07:03.239785 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.239708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4656c7cvjvu8f\"" Apr 21 10:07:03.240726 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.240705 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 10:07:03.240838 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.240814 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 10:07:03.241378 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.241158 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qdzph\"" Apr 21 10:07:03.241378 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.241197 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 10:07:03.242078 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.241996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 10:07:03.242175 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.242084 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 10:07:03.242175 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.242132 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 10:07:03.242636 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.242570 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 10:07:03.242814 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.242792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 10:07:03.243020 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.243001 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 10:07:03.243854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.243833 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 10:07:03.257774 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.257747 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:07:03.391446 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391446 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391672 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391672 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
21 10:07:03.391771 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrdv\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-kube-api-access-sbrdv\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391771 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391771 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391885 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391916 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.391964 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.391980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392056 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392021 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392103 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392154 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392154 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.392235 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.392218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493017 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.492985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493129 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493129 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrdv\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-kube-api-access-sbrdv\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.493641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493642 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:07:03.493747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494015 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.493838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494433 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.494322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494513 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.494492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494598 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.494517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.494891 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.494792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.495674 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.495327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.496701 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.496219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.496701 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.496396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.496701 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.496658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.496958 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.496908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.498301 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.498261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.498634 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.498529 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.498634 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.498608 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.498862 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.498846 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.499358 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.499336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.499634 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.499614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.499754 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.499732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.499957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.499941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.508714 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.508683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrdv\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-kube-api-access-sbrdv\") pod \"prometheus-k8s-0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.547000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.546964 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:03.699508 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.699467 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:07:03.762072 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.761992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665"} Apr 21 10:07:03.763527 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:03.763493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" event={"ID":"049e9c77-bac7-43db-b890-f3604cc9398b","Type":"ContainerStarted","Data":"7d0a8f62b963bb671d07a7f0f1f0944ea64319b844a6986cd7776fc9dfecb115"} Apr 21 10:07:03.775414 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:07:03.775381 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a1374f6_cd56_4f0d_bff6_4f9d32a4b1f0.slice/crio-8eac4b4ff20b54519a1ec76fff3ff2a0e9ee6a1c582932746161ca494ba1a904 WatchSource:0}: Error finding container 8eac4b4ff20b54519a1ec76fff3ff2a0e9ee6a1c582932746161ca494ba1a904: Status 404 returned error can't find the container with id 8eac4b4ff20b54519a1ec76fff3ff2a0e9ee6a1c582932746161ca494ba1a904 Apr 21 10:07:04.771376 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:04.771343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerStarted","Data":"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8"} Apr 21 10:07:04.772635 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:04.772615 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6" exitCode=0 Apr 21 10:07:04.772737 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:04.772693 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} Apr 21 10:07:04.772737 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:04.772723 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"8eac4b4ff20b54519a1ec76fff3ff2a0e9ee6a1c582932746161ca494ba1a904"} Apr 21 10:07:04.803617 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:04.803568 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.6249519860000001 podStartE2EDuration="6.80355428s" podCreationTimestamp="2026-04-21 10:06:58 +0000 UTC" firstStartedPulling="2026-04-21 10:06:58.643481831 +0000 UTC m=+197.014282531" lastFinishedPulling="2026-04-21 10:07:03.822084103 +0000 UTC m=+202.192884825" observedRunningTime="2026-04-21 10:07:04.80260869 +0000 UTC m=+203.173409400" watchObservedRunningTime="2026-04-21 10:07:04.80355428 +0000 UTC m=+203.174354980" Apr 21 10:07:04.803899 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:04.803878 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" podStartSLOduration=2.195132587 podStartE2EDuration="3.803872121s" podCreationTimestamp="2026-04-21 10:07:01 +0000 UTC" firstStartedPulling="2026-04-21 10:07:01.863302269 +0000 UTC m=+200.234102957" lastFinishedPulling="2026-04-21 10:07:03.472041805 +0000 UTC m=+201.842842491" observedRunningTime="2026-04-21 10:07:03.788373099 +0000 UTC 
m=+202.159173808" watchObservedRunningTime="2026-04-21 10:07:04.803872121 +0000 UTC m=+203.174672829" Apr 21 10:07:06.811361 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:06.811332 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7d85fbdf48-88xdd"] Apr 21 10:07:06.811774 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:07:06.811633 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" podUID="e9bd5820-04e6-410f-bda9-b7b67da26521" Apr 21 10:07:07.786560 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.786085 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:07:07.786560 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.786208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} Apr 21 10:07:07.786560 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.786245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} Apr 21 10:07:07.792636 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.792610 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.836845 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-bound-sa-token\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.836917 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-image-registry-private-configuration\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.836950 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9bd5820-04e6-410f-bda9-b7b67da26521-ca-trust-extracted\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.836975 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-installation-pull-secrets\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.837007 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-certificates\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: 
\"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.837103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-trusted-ca\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.837855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.837148 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9h9n\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-kube-api-access-d9h9n\") pod \"e9bd5820-04e6-410f-bda9-b7b67da26521\" (UID: \"e9bd5820-04e6-410f-bda9-b7b67da26521\") " Apr 21 10:07:07.838942 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.838911 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:07.839116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.839092 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9bd5820-04e6-410f-bda9-b7b67da26521-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:07:07.839456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.839431 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:07.840475 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.840435 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:07.840848 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.840690 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:07.841003 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.840974 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-kube-api-access-d9h9n" (OuterVolumeSpecName: "kube-api-access-d9h9n") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "kube-api-access-d9h9n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:07.841994 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.841965 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e9bd5820-04e6-410f-bda9-b7b67da26521" (UID: "e9bd5820-04e6-410f-bda9-b7b67da26521"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:07.938418 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938381 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-trusted-ca\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:07.938418 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938409 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d9h9n\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-kube-api-access-d9h9n\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:07.938418 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938420 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-bound-sa-token\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:07.938671 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938429 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-image-registry-private-configuration\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:07.938671 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938439 2573 reconciler_common.go:299] "Volume 
detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9bd5820-04e6-410f-bda9-b7b67da26521-ca-trust-extracted\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:07.938671 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938448 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9bd5820-04e6-410f-bda9-b7b67da26521-installation-pull-secrets\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:07.938671 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:07.938456 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-certificates\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:08.789292 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:08.789258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d85fbdf48-88xdd" Apr 21 10:07:08.827057 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:08.826977 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7d85fbdf48-88xdd"] Apr 21 10:07:08.834551 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:08.834511 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7d85fbdf48-88xdd"] Apr 21 10:07:08.947865 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:08.947830 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9bd5820-04e6-410f-bda9-b7b67da26521-registry-tls\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.794408 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:09.794377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} Apr 21 10:07:09.794408 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:09.794410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} Apr 21 10:07:09.794623 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:09.794422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} Apr 21 10:07:09.794623 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:09.794430 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerStarted","Data":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} Apr 21 10:07:09.822720 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:09.822656 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.352188812 podStartE2EDuration="6.822641116s" podCreationTimestamp="2026-04-21 10:07:03 +0000 UTC" firstStartedPulling="2026-04-21 10:07:04.773732132 +0000 UTC m=+203.144532819" lastFinishedPulling="2026-04-21 10:07:09.244184428 +0000 UTC m=+207.614985123" observedRunningTime="2026-04-21 10:07:09.82130206 +0000 UTC m=+208.192102813" watchObservedRunningTime="2026-04-21 10:07:09.822641116 +0000 UTC m=+208.193441824" Apr 21 10:07:10.191911 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:10.191827 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9bd5820-04e6-410f-bda9-b7b67da26521" 
path="/var/lib/kubelet/pods/e9bd5820-04e6-410f-bda9-b7b67da26521/volumes" Apr 21 10:07:13.547732 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:13.547694 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:21.712783 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:21.712750 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:21.712783 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:21.712789 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:34.798084 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:34.798053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-594w4_0622ef89-a9c2-4672-891f-4e52ebb096b4/dns-node-resolver/0.log" Apr 21 10:07:41.718250 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:41.718216 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:41.722142 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:41.722124 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-fb55fcdd5-67vqq" Apr 21 10:07:45.892529 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:45.892498 2573 generic.go:358] "Generic (PLEG): container finished" podID="087fe162-4bd2-4285-92a8-117f3a58caa3" containerID="6b4c3aa3af8bb3ba7ad7acd1bbf56d28d521443eb83800a6014f78288a017ca9" exitCode=0 Apr 21 10:07:45.892891 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:45.892565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" 
event={"ID":"087fe162-4bd2-4285-92a8-117f3a58caa3","Type":"ContainerDied","Data":"6b4c3aa3af8bb3ba7ad7acd1bbf56d28d521443eb83800a6014f78288a017ca9"} Apr 21 10:07:45.892933 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:45.892897 2573 scope.go:117] "RemoveContainer" containerID="6b4c3aa3af8bb3ba7ad7acd1bbf56d28d521443eb83800a6014f78288a017ca9" Apr 21 10:07:46.897936 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:46.897904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jwp2j" event={"ID":"087fe162-4bd2-4285-92a8-117f3a58caa3","Type":"ContainerStarted","Data":"90d264ec1270d580e56f135cc93e472fa03f8eb98abd1204e17ff6eb337c62bb"} Apr 21 10:07:54.056546 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:54.056492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:07:54.058863 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:54.058832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a931daa8-594d-442d-b462-5f77532314a5-metrics-certs\") pod \"network-metrics-daemon-g9jwm\" (UID: \"a931daa8-594d-442d-b462-5f77532314a5\") " pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:07:54.091111 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:54.091072 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfhd5\"" Apr 21 10:07:54.098841 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:54.098812 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g9jwm" Apr 21 10:07:54.223686 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:54.223662 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g9jwm"] Apr 21 10:07:54.226476 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:07:54.226444 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda931daa8_594d_442d_b462_5f77532314a5.slice/crio-ed384b8fa2bbb9331d262cdc9d4ebefb8ff32e237a80fdaab330043600f54723 WatchSource:0}: Error finding container ed384b8fa2bbb9331d262cdc9d4ebefb8ff32e237a80fdaab330043600f54723: Status 404 returned error can't find the container with id ed384b8fa2bbb9331d262cdc9d4ebefb8ff32e237a80fdaab330043600f54723 Apr 21 10:07:54.921880 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:54.921842 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g9jwm" event={"ID":"a931daa8-594d-442d-b462-5f77532314a5","Type":"ContainerStarted","Data":"ed384b8fa2bbb9331d262cdc9d4ebefb8ff32e237a80fdaab330043600f54723"} Apr 21 10:07:55.925965 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:55.925930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g9jwm" event={"ID":"a931daa8-594d-442d-b462-5f77532314a5","Type":"ContainerStarted","Data":"785ac000c8e3dd0ae5224f5ed6746d3ff221623799184124a6fdd156ebb1cb65"} Apr 21 10:07:55.925965 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:55.925963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g9jwm" event={"ID":"a931daa8-594d-442d-b462-5f77532314a5","Type":"ContainerStarted","Data":"7ed2dcb6e0bc369e1358eff81030817f849bc65bfd7507461137235dcb83cee4"} Apr 21 10:07:55.942075 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:07:55.942023 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-g9jwm" podStartSLOduration=252.964337874 podStartE2EDuration="4m13.942008833s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:07:54.228734479 +0000 UTC m=+252.599535184" lastFinishedPulling="2026-04-21 10:07:55.206405451 +0000 UTC m=+253.577206143" observedRunningTime="2026-04-21 10:07:55.940303494 +0000 UTC m=+254.311104203" watchObservedRunningTime="2026-04-21 10:07:55.942008833 +0000 UTC m=+254.312809521" Apr 21 10:08:03.548071 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:03.548031 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:03.566640 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:03.566612 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:03.968510 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:03.968435 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:17.294862 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.294826 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:08:17.296320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.296110 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="alertmanager" containerID="cri-o://4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a" gracePeriod=120 Apr 21 10:08:17.296320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.296167 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-metric" 
containerID="cri-o://c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665" gracePeriod=120 Apr 21 10:08:17.296320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.296187 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy" containerID="cri-o://e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8" gracePeriod=120 Apr 21 10:08:17.296320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.296237 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="config-reloader" containerID="cri-o://9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f" gracePeriod=120 Apr 21 10:08:17.296320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.296190 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-web" containerID="cri-o://b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91" gracePeriod=120 Apr 21 10:08:17.296320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.296234 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="prom-label-proxy" containerID="cri-o://167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8" gracePeriod=120 Apr 21 10:08:17.993286 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993250 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8" exitCode=0 Apr 21 10:08:17.993286 ip-10-0-140-234 kubenswrapper[2573]: I0421 
10:08:17.993277 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8" exitCode=0 Apr 21 10:08:17.993286 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993284 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f" exitCode=0 Apr 21 10:08:17.993286 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993290 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a" exitCode=0 Apr 21 10:08:17.993561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8"} Apr 21 10:08:17.993561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8"} Apr 21 10:08:17.993561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f"} Apr 21 10:08:17.993561 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:17.993377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a"} Apr 21 10:08:18.537217 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.537193 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:18.668228 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668122 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvlm7\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-kube-api-access-nvlm7\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668228 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668181 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-main-db\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668228 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668218 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-web\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668228 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668234 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-tls-assets\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668254 
2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668273 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-web-config\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668293 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-main-tls\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668325 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-trusted-ca-bundle\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668382 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-out\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668413 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-cluster-tls-config\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668446 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-volume\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668486 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668515 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-metrics-client-ca\") pod \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\" (UID: \"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780\") " Apr 21 10:08:18.668665 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668648 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:08:18.669131 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668933 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-main-db\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.669131 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.668981 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:08:18.671200 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671165 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.671200 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671194 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-kube-api-access-nvlm7" (OuterVolumeSpecName: "kube-api-access-nvlm7") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "kube-api-access-nvlm7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:08:18.671463 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671202 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.671463 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671350 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.671635 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671591 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:08:18.671692 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671646 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:08:18.671822 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671802 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-out" (OuterVolumeSpecName: "config-out") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:08:18.671980 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.671961 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.672967 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.672945 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.676320 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.676298 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.681911 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.681891 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-web-config" (OuterVolumeSpecName: "web-config") pod "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" (UID: "bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:08:18.770335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770302 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvlm7\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-kube-api-access-nvlm7\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770335 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770337 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770349 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-tls-assets\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770359 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770368 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-web-config\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770377 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-main-tls\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770386 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770395 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-out\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770403 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-cluster-tls-config\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770412 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-config-volume\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770422 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.770499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.770430 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780-metrics-client-ca\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:18.998710 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998678 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665" exitCode=0 Apr 21 10:08:18.998710 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998703 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerID="b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91" exitCode=0 Apr 21 10:08:18.998902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665"} Apr 21 10:08:18.998902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91"} Apr 21 10:08:18.998902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780","Type":"ContainerDied","Data":"482eb58361898789dc14397c64bd3af7e4df56e511d6c6c55812810229d42bfa"} Apr 21 10:08:18.998902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998808 2573 scope.go:117] "RemoveContainer" containerID="167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8" Apr 21 10:08:18.998902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:18.998838 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.007706 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.007686 2573 scope.go:117] "RemoveContainer" containerID="c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665" Apr 21 10:08:19.014790 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.014771 2573 scope.go:117] "RemoveContainer" containerID="e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8" Apr 21 10:08:19.021242 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.021223 2573 scope.go:117] "RemoveContainer" containerID="b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91" Apr 21 10:08:19.023138 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.023115 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:08:19.029328 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.029311 2573 scope.go:117] "RemoveContainer" containerID="9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f" Apr 21 10:08:19.029626 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.029604 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:08:19.035723 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.035703 2573 scope.go:117] "RemoveContainer" containerID="4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a" Apr 21 10:08:19.042188 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.042168 
2573 scope.go:117] "RemoveContainer" containerID="8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89" Apr 21 10:08:19.049226 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.049206 2573 scope.go:117] "RemoveContainer" containerID="167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8" Apr 21 10:08:19.049513 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.049492 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8\": container with ID starting with 167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8 not found: ID does not exist" containerID="167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8" Apr 21 10:08:19.049617 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.049527 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8"} err="failed to get container status \"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8\": rpc error: code = NotFound desc = could not find container \"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8\": container with ID starting with 167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8 not found: ID does not exist" Apr 21 10:08:19.049617 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.049590 2573 scope.go:117] "RemoveContainer" containerID="c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665" Apr 21 10:08:19.049843 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.049825 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665\": container with ID starting with c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665 not found: ID 
does not exist" containerID="c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665" Apr 21 10:08:19.049879 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.049851 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665"} err="failed to get container status \"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665\": rpc error: code = NotFound desc = could not find container \"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665\": container with ID starting with c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665 not found: ID does not exist" Apr 21 10:08:19.049879 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.049867 2573 scope.go:117] "RemoveContainer" containerID="e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8" Apr 21 10:08:19.050060 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.050042 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8\": container with ID starting with e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8 not found: ID does not exist" containerID="e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8" Apr 21 10:08:19.050122 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050067 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8"} err="failed to get container status \"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8\": rpc error: code = NotFound desc = could not find container \"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8\": container with ID starting with e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8 not found: ID does not 
exist" Apr 21 10:08:19.050122 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050089 2573 scope.go:117] "RemoveContainer" containerID="b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91" Apr 21 10:08:19.050292 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.050271 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91\": container with ID starting with b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91 not found: ID does not exist" containerID="b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91" Apr 21 10:08:19.050399 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050293 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91"} err="failed to get container status \"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91\": rpc error: code = NotFound desc = could not find container \"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91\": container with ID starting with b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91 not found: ID does not exist" Apr 21 10:08:19.050399 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050306 2573 scope.go:117] "RemoveContainer" containerID="9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f" Apr 21 10:08:19.050571 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.050552 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f\": container with ID starting with 9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f not found: ID does not exist" containerID="9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f" Apr 21 
10:08:19.050648 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050580 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f"} err="failed to get container status \"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f\": rpc error: code = NotFound desc = could not find container \"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f\": container with ID starting with 9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f not found: ID does not exist" Apr 21 10:08:19.050648 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050602 2573 scope.go:117] "RemoveContainer" containerID="4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a" Apr 21 10:08:19.050829 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.050810 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a\": container with ID starting with 4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a not found: ID does not exist" containerID="4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a" Apr 21 10:08:19.050866 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050832 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a"} err="failed to get container status \"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a\": rpc error: code = NotFound desc = could not find container \"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a\": container with ID starting with 4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a not found: ID does not exist" Apr 21 10:08:19.050866 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.050844 2573 scope.go:117] 
"RemoveContainer" containerID="8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89" Apr 21 10:08:19.051048 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:19.051032 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89\": container with ID starting with 8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89 not found: ID does not exist" containerID="8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89" Apr 21 10:08:19.051113 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051054 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89"} err="failed to get container status \"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89\": rpc error: code = NotFound desc = could not find container \"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89\": container with ID starting with 8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89 not found: ID does not exist" Apr 21 10:08:19.051113 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051073 2573 scope.go:117] "RemoveContainer" containerID="167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8" Apr 21 10:08:19.051269 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051245 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8"} err="failed to get container status \"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8\": rpc error: code = NotFound desc = could not find container \"167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8\": container with ID starting with 167548f80ea67102d960c0bb376e766fb93083df9620ebe26fd870c13fc9a2e8 not found: ID does not 
exist" Apr 21 10:08:19.051315 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051269 2573 scope.go:117] "RemoveContainer" containerID="c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665" Apr 21 10:08:19.051436 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051421 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665"} err="failed to get container status \"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665\": rpc error: code = NotFound desc = could not find container \"c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665\": container with ID starting with c15822702f079fd7b105caeb00d14904c746ad4bc6e18bbafffcaadac755e665 not found: ID does not exist" Apr 21 10:08:19.051481 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051435 2573 scope.go:117] "RemoveContainer" containerID="e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8" Apr 21 10:08:19.051666 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051649 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8"} err="failed to get container status \"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8\": rpc error: code = NotFound desc = could not find container \"e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8\": container with ID starting with e263620f114f73a170937b9e3db9f4787a62a630015a4cf0175fef3f421aada8 not found: ID does not exist" Apr 21 10:08:19.051706 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051667 2573 scope.go:117] "RemoveContainer" containerID="b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91" Apr 21 10:08:19.051870 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051853 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91"} err="failed to get container status \"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91\": rpc error: code = NotFound desc = could not find container \"b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91\": container with ID starting with b2e3af18bcd09eeb903f08ee40b67b474947821dd8a2657093e4a32702eedf91 not found: ID does not exist" Apr 21 10:08:19.051910 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.051871 2573 scope.go:117] "RemoveContainer" containerID="9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f" Apr 21 10:08:19.052091 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.052075 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f"} err="failed to get container status \"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f\": rpc error: code = NotFound desc = could not find container \"9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f\": container with ID starting with 9134015dc9e171d7423f67fe8f311c3f6fb571e45d3f0dfd006606e574413c8f not found: ID does not exist" Apr 21 10:08:19.052091 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.052090 2573 scope.go:117] "RemoveContainer" containerID="4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a" Apr 21 10:08:19.052251 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.052235 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a"} err="failed to get container status \"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a\": rpc error: code = NotFound desc = could not find container \"4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a\": container with ID starting with 
4e2a2710ae27f3ddfaebc42db03a613d53fe4300156736e1858debb004a0579a not found: ID does not exist" Apr 21 10:08:19.052291 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.052252 2573 scope.go:117] "RemoveContainer" containerID="8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89" Apr 21 10:08:19.052417 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.052398 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89"} err="failed to get container status \"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89\": rpc error: code = NotFound desc = could not find container \"8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89\": container with ID starting with 8b529eba91d3944642fd662ca68d9df40a0ca89c0551a6bd6949606c5dc25e89 not found: ID does not exist" Apr 21 10:08:19.056406 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056385 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:08:19.056679 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056665 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-metric" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056680 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-metric" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056689 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-web" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056695 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" 
containerName="kube-rbac-proxy-web" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056703 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="alertmanager" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056709 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="alertmanager" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056716 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056723 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy" Apr 21 10:08:19.056731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056733 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="init-config-reloader" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056739 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="init-config-reloader" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056745 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="prom-label-proxy" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056751 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="prom-label-proxy" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056758 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="config-reloader" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056763 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="config-reloader" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056806 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056813 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="prom-label-proxy" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056820 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="config-reloader" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056826 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-web" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056833 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="alertmanager" Apr 21 10:08:19.056983 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.056840 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" containerName="kube-rbac-proxy-metric" Apr 21 10:08:19.060590 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.060575 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.063050 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063033 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 10:08:19.063148 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063047 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 10:08:19.063148 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063071 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 10:08:19.063148 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063110 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 10:08:19.063148 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063143 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 10:08:19.063363 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063319 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 10:08:19.063674 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063656 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 10:08:19.063764 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.063710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 10:08:19.064135 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.064118 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6v8sh\"" Apr 21 10:08:19.068974 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.068954 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 10:08:19.072979 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.072958 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:08:19.174052 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174013 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174052 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174257 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174077 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174257 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174257 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174257 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-config-volume\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174381 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174275 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-web-config\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174381 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-config-out\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
10:08:19.174381 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174381 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174512 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4m9b\" (UniqueName: \"kubernetes.io/projected/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-kube-api-access-q4m9b\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174512 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.174512 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.174434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.274985 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.274884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.274985 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.274923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-config-volume\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.274985 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.274965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-web-config\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-config-out\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4m9b\" (UniqueName: \"kubernetes.io/projected/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-kube-api-access-q4m9b\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275279 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275307 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.275649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.275427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.276155 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.276130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.276746 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.276385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.276849 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.276749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.278095 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.278055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-web-config\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.278278 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.278237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-config-out\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.278359 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.278295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.278423 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.278402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.278844 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.278827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.278913 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.278887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-config-volume\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.279157 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.279133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.279275 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.279249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.279810 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.279790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.283474 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.283450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4m9b\" (UniqueName: \"kubernetes.io/projected/1ad7e63d-69fd-4355-afcb-eb1adfe55dc4-kube-api-access-q4m9b\") pod \"alertmanager-main-0\" (UID: \"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.370259 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.370222 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:08:19.499158 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:19.499120 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:08:19.501624 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:08:19.501597 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad7e63d_69fd_4355_afcb_eb1adfe55dc4.slice/crio-6b7f3ca24c905f047578ba97acc23d782794ce66b07c7f9ace2c2bdfb783c472 WatchSource:0}: Error finding container 6b7f3ca24c905f047578ba97acc23d782794ce66b07c7f9ace2c2bdfb783c472: Status 404 returned error can't find the container with id 6b7f3ca24c905f047578ba97acc23d782794ce66b07c7f9ace2c2bdfb783c472 Apr 21 10:08:20.003446 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:20.003414 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ad7e63d-69fd-4355-afcb-eb1adfe55dc4" containerID="855c17b88877ad9753a21401e5ffa52d2d285c770068a6590acc6ef7c6b589a5" exitCode=0 Apr 21 10:08:20.003906 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:20.003508 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerDied","Data":"855c17b88877ad9753a21401e5ffa52d2d285c770068a6590acc6ef7c6b589a5"} Apr 21 10:08:20.003906 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:20.003572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"6b7f3ca24c905f047578ba97acc23d782794ce66b07c7f9ace2c2bdfb783c472"} Apr 21 10:08:20.193188 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:20.193157 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780" 
path="/var/lib/kubelet/pods/bf7f0eb5-4d61-4bc5-8eb8-7be3eec9c780/volumes" Apr 21 10:08:21.009912 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.009876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"5cf900772be85ebf4d7505df9dbf520e488401d65c8f87757022f56159a3aad3"} Apr 21 10:08:21.009912 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.009909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"a39e0a320adc5e77c72fd63fc98069ebdff2b6b3763d73beee2f022f748590c0"} Apr 21 10:08:21.010435 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.009930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"20fbff32495eff9b065b7a62fb2994338e79c83328a8aa0dfc33d9ca9d1d48a2"} Apr 21 10:08:21.010435 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.009940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"c239723b07423491ecebb6ca5d1b6a546ead11e11b842c538b0538035e8f4a9e"} Apr 21 10:08:21.010435 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.009948 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"04d498324d5f5923689c5ea2e3b3a90990812f0afe2ed4a376090ab59d7139a4"} Apr 21 10:08:21.010435 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.009956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1ad7e63d-69fd-4355-afcb-eb1adfe55dc4","Type":"ContainerStarted","Data":"6198ba1eebca895e02f8c50a189e6a80941c454e88181e67169674e31f613104"} Apr 21 10:08:21.043991 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.043817 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.043797237 podStartE2EDuration="2.043797237s" podCreationTimestamp="2026-04-21 10:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:08:21.043518955 +0000 UTC m=+279.414319683" watchObservedRunningTime="2026-04-21 10:08:21.043797237 +0000 UTC m=+279.414597946" Apr 21 10:08:21.368036 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.367990 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5b4b858b84-8kgml"] Apr 21 10:08:21.370496 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.370477 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.372797 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.372773 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-htw47\"" Apr 21 10:08:21.373181 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.373169 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 10:08:21.373370 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.373358 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 10:08:21.373436 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.373420 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 10:08:21.373494 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.373456 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 10:08:21.373786 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.373770 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 10:08:21.382916 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.382886 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 10:08:21.387076 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.387057 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b4b858b84-8kgml"] Apr 21 10:08:21.497799 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.497756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rstpw\" (UniqueName: \"kubernetes.io/projected/294e1d79-5934-4d4d-b1c2-65706662f756-kube-api-access-rstpw\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.497994 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.497805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-secret-telemeter-client\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.497994 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.497841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-serving-certs-ca-bundle\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.497994 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.497865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.497994 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.497889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-metrics-client-ca\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.497994 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.497951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-federate-client-tls\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.498158 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.498024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.498158 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.498066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-telemeter-client-tls\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599005 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.598971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rstpw\" (UniqueName: \"kubernetes.io/projected/294e1d79-5934-4d4d-b1c2-65706662f756-kube-api-access-rstpw\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: 
\"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599005 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-secret-telemeter-client\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-serving-certs-ca-bundle\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-metrics-client-ca\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599086 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-federate-client-tls\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.599238 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.599182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-telemeter-client-tls\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.600089 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.600057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-serving-certs-ca-bundle\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.600236 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.600083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-metrics-client-ca\") pod 
\"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.600236 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.600159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294e1d79-5934-4d4d-b1c2-65706662f756-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.601822 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.601795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-telemeter-client-tls\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.601892 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.601871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.601977 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.601960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-secret-telemeter-client\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.602024 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:08:21.602006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/294e1d79-5934-4d4d-b1c2-65706662f756-federate-client-tls\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.607827 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.607805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstpw\" (UniqueName: \"kubernetes.io/projected/294e1d79-5934-4d4d-b1c2-65706662f756-kube-api-access-rstpw\") pod \"telemeter-client-5b4b858b84-8kgml\" (UID: \"294e1d79-5934-4d4d-b1c2-65706662f756\") " pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.654106 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654016 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:08:21.654528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654477 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="prometheus" containerID="cri-o://df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a" gracePeriod=600 Apr 21 10:08:21.654528 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654516 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="thanos-sidecar" containerID="cri-o://f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc" gracePeriod=600 Apr 21 10:08:21.654727 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654574 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="config-reloader" containerID="cri-o://fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150" gracePeriod=600 Apr 21 10:08:21.654727 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654494 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy" containerID="cri-o://710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819" gracePeriod=600 Apr 21 10:08:21.654727 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654675 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-web" containerID="cri-o://6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d" gracePeriod=600 Apr 21 10:08:21.654878 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.654732 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-thanos" containerID="cri-o://55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30" gracePeriod=600 Apr 21 10:08:21.684174 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.684147 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" Apr 21 10:08:21.815974 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.815945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b4b858b84-8kgml"] Apr 21 10:08:21.817835 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:08:21.817807 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294e1d79_5934_4d4d_b1c2_65706662f756.slice/crio-389fdcadb5726186dcf358825a4b7dbdc4f20018abf775d75655f5eef4ecb9b6 WatchSource:0}: Error finding container 389fdcadb5726186dcf358825a4b7dbdc4f20018abf775d75655f5eef4ecb9b6: Status 404 returned error can't find the container with id 389fdcadb5726186dcf358825a4b7dbdc4f20018abf775d75655f5eef4ecb9b6 Apr 21 10:08:21.898632 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:21.898608 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.003116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003019 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-serving-certs-ca-bundle\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " Apr 21 10:08:22.003116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003063 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-web-config\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") " Apr 21 10:08:22.003116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003095 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-tls\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003114 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config-out\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003135 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-kube-rbac-proxy\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003162 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-trusted-ca-bundle\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003195 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-grpc-tls\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003224 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003266 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003292 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-kubelet-serving-ca-bundle\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003324 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003365 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-metrics-client-ca\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003391 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-tls-assets\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003456 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003443 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-thanos-prometheus-http-client-file\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003972 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003475 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-db\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003972 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003512 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-rulefiles-0\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003972 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003617 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbrdv\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-kube-api-access-sbrdv\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.003972 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003677 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-metrics-client-certs\") pod \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\" (UID: \"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0\") "
Apr 21 10:08:22.005204 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003611 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:22.005332 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.003645 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:22.005332 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.004712 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:22.005332 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.004827 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:22.005915 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.005880 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:22.006122 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.006095 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:08:22.006822 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.006792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.006918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.006854 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.007509 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.007488 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-kube-api-access-sbrdv" (OuterVolumeSpecName: "kube-api-access-sbrdv") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "kube-api-access-sbrdv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:08:22.007917 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.007864 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.007917 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.007866 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:08:22.008078 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.007929 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.008078 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.007957 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config" (OuterVolumeSpecName: "config") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.008078 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.008010 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.008234 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.008173 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.008288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.008264 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config-out" (OuterVolumeSpecName: "config-out") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:08:22.009197 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.009179 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.015830 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015804 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30" exitCode=0
Apr 21 10:08:22.015830 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015827 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819" exitCode=0
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015834 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d" exitCode=0
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015841 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc" exitCode=0
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015846 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150" exitCode=0
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015851 2573 generic.go:358] "Generic (PLEG): container finished" podID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a" exitCode=0
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"}
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015908 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"}
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"}
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"}
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015985 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"}
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015998 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.015999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"}
Apr 21 10:08:22.016220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.016105 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0","Type":"ContainerDied","Data":"8eac4b4ff20b54519a1ec76fff3ff2a0e9ee6a1c582932746161ca494ba1a904"}
Apr 21 10:08:22.017116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.017093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" event={"ID":"294e1d79-5934-4d4d-b1c2-65706662f756","Type":"ContainerStarted","Data":"389fdcadb5726186dcf358825a4b7dbdc4f20018abf775d75655f5eef4ecb9b6"}
Apr 21 10:08:22.017709 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.017684 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-web-config" (OuterVolumeSpecName: "web-config") pod "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" (UID: "1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:22.023075 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.023060 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"
Apr 21 10:08:22.029474 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.029457 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"
Apr 21 10:08:22.036007 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.035979 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"
Apr 21 10:08:22.041996 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.041979 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"
Apr 21 10:08:22.048207 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.048192 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"
Apr 21 10:08:22.054740 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.054724 2573 scope.go:117] "RemoveContainer" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"
Apr 21 10:08:22.060460 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.060444 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"
Apr 21 10:08:22.060719 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.060701 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"
Apr 21 10:08:22.060769 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.060727 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} err="failed to get container status \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist"
Apr 21 10:08:22.060769 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.060745 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"
Apr 21 10:08:22.060938 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.060924 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"
Apr 21 10:08:22.060976 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.060941 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} err="failed to get container status \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist"
Apr 21 10:08:22.060976 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.060954 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"
Apr 21 10:08:22.061188 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.061170 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"
Apr 21 10:08:22.061232 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061194 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} err="failed to get container status \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist"
Apr 21 10:08:22.061232 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061216 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"
Apr 21 10:08:22.061406 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.061388 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"
Apr 21 10:08:22.061442 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061406 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} err="failed to get container status \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": rpc error: code = NotFound desc = could not find container \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist"
Apr 21 10:08:22.061442 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061417 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"
Apr 21 10:08:22.061652 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.061622 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"
Apr 21 10:08:22.061730 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061656 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} err="failed to get container status \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist"
Apr 21 10:08:22.061730 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061671 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"
Apr 21 10:08:22.061904 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.061889 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"
Apr 21 10:08:22.061947 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061907 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} err="failed to get container status \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist"
Apr 21 10:08:22.061947 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.061918 2573 scope.go:117] "RemoveContainer" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"
Apr 21 10:08:22.062147 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.062130 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"
Apr 21 10:08:22.062190 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062151 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} err="failed to get container status \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": rpc error: code = NotFound desc = could not find container \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist"
Apr 21 10:08:22.062190 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062167 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"
Apr 21 10:08:22.062374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062359 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} err="failed to get container status \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist"
Apr 21 10:08:22.062420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062375 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"
Apr 21 10:08:22.062605 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062587 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} err="failed to get container status \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist"
Apr 21 10:08:22.062605 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062603 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"
Apr 21 10:08:22.062819 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062796 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} err="failed to get container status \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist"
Apr 21 10:08:22.062860 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.062820 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"
Apr 21 10:08:22.063031 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063015 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} err="failed to get container status \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": rpc error: code = NotFound desc = could not find container \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist"
Apr 21 10:08:22.063080 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063031 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"
Apr 21 10:08:22.063227 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063212 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} err="failed to get container status \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist"
Apr 21 10:08:22.063282 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063228 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"
Apr 21 10:08:22.063436 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063418 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} err="failed to get container status \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist"
Apr 21 10:08:22.063475 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063437 2573 scope.go:117] "RemoveContainer" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"
Apr 21 10:08:22.063664 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063632 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} err="failed to get container status \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": rpc error: code = NotFound desc = could not find container \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist"
Apr 21 10:08:22.063715 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063666 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"
Apr 21 10:08:22.063885 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063863 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} err="failed to get container status \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist"
Apr 21 10:08:22.063885 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.063883 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"
Apr 21 10:08:22.064070 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064056 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} err="failed to get container status \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist"
Apr 21 10:08:22.064070 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064070 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"
Apr 21 10:08:22.064277 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064251 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} err="failed to get container status \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist"
Apr 21 10:08:22.064277 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064272 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"
Apr 21 10:08:22.064452 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064433 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} err="failed to get container status \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": rpc error: code = NotFound desc = could not find container \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist"
Apr 21 10:08:22.064498 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064454 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"
Apr 21 10:08:22.064712 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064694 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} err="failed to get container status \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist"
Apr 21 10:08:22.064775 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064713 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"
Apr 21 10:08:22.064913 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064897 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} err="failed to get container status \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with
df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist" Apr 21 10:08:22.064957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.064914 2573 scope.go:117] "RemoveContainer" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6" Apr 21 10:08:22.065135 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065117 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} err="failed to get container status \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": rpc error: code = NotFound desc = could not find container \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist" Apr 21 10:08:22.065198 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065137 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30" Apr 21 10:08:22.065353 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065337 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} err="failed to get container status \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist" Apr 21 10:08:22.065418 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065356 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819" Apr 21 10:08:22.065613 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065598 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} err="failed to get container status \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist" Apr 21 10:08:22.065675 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065614 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d" Apr 21 10:08:22.065859 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065838 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} err="failed to get container status \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist" Apr 21 10:08:22.065898 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.065860 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc" Apr 21 10:08:22.066062 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066044 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} err="failed to get container status \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": rpc error: code = NotFound desc = could not find container 
\"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist" Apr 21 10:08:22.066104 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066063 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150" Apr 21 10:08:22.066298 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066278 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} err="failed to get container status \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist" Apr 21 10:08:22.066356 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066299 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a" Apr 21 10:08:22.066516 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066500 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} err="failed to get container status \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist" Apr 21 10:08:22.066568 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066517 2573 scope.go:117] "RemoveContainer" 
containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6" Apr 21 10:08:22.066763 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066746 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} err="failed to get container status \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": rpc error: code = NotFound desc = could not find container \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist" Apr 21 10:08:22.066803 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066764 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30" Apr 21 10:08:22.066992 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066974 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} err="failed to get container status \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist" Apr 21 10:08:22.067034 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.066993 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819" Apr 21 10:08:22.067183 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067169 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} err="failed to get container status 
\"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist" Apr 21 10:08:22.067226 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067184 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d" Apr 21 10:08:22.067380 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067361 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} err="failed to get container status \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist" Apr 21 10:08:22.067444 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067383 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc" Apr 21 10:08:22.067592 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067576 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} err="failed to get container status \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": rpc error: code = NotFound desc = could not find container \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist" Apr 21 10:08:22.067640 ip-10-0-140-234 
kubenswrapper[2573]: I0421 10:08:22.067593 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150" Apr 21 10:08:22.067744 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067729 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} err="failed to get container status \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist" Apr 21 10:08:22.067789 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067744 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a" Apr 21 10:08:22.067951 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067934 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} err="failed to get container status \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist" Apr 21 10:08:22.067998 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.067952 2573 scope.go:117] "RemoveContainer" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6" Apr 21 10:08:22.068194 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068179 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} err="failed to get container status \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": rpc error: code = NotFound desc = could not find container \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist" Apr 21 10:08:22.068194 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068193 2573 scope.go:117] "RemoveContainer" containerID="55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30" Apr 21 10:08:22.068393 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068379 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30"} err="failed to get container status \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": rpc error: code = NotFound desc = could not find container \"55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30\": container with ID starting with 55debbf64629330479916e3e19a3049ebb61272406b3c47cc2d2f6cc1375ae30 not found: ID does not exist" Apr 21 10:08:22.068433 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068393 2573 scope.go:117] "RemoveContainer" containerID="710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819" Apr 21 10:08:22.068664 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068648 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819"} err="failed to get container status \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": rpc error: code = NotFound desc = could not find container \"710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819\": container with ID starting with 
710c7fa46d6ab0ad7ef27dd8ef96542e30a603ecc149263ba3294c589af80819 not found: ID does not exist" Apr 21 10:08:22.068709 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068667 2573 scope.go:117] "RemoveContainer" containerID="6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d" Apr 21 10:08:22.068863 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068844 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d"} err="failed to get container status \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": rpc error: code = NotFound desc = could not find container \"6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d\": container with ID starting with 6505cf6b96beaa78f50376c94e295b9a31e3fde81d67d27c1f58faed27d5801d not found: ID does not exist" Apr 21 10:08:22.068902 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.068863 2573 scope.go:117] "RemoveContainer" containerID="f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc" Apr 21 10:08:22.069047 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069033 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc"} err="failed to get container status \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": rpc error: code = NotFound desc = could not find container \"f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc\": container with ID starting with f95d2fd05a95c7e1db05edfd62f5fb5feb80a96ba63cb1cc34841f7d58776fcc not found: ID does not exist" Apr 21 10:08:22.069094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069047 2573 scope.go:117] "RemoveContainer" containerID="fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150" Apr 21 10:08:22.069216 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069202 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150"} err="failed to get container status \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": rpc error: code = NotFound desc = could not find container \"fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150\": container with ID starting with fd8eddd5698da3871a8cbcd4217f705480918705776037e6952295c19b85d150 not found: ID does not exist" Apr 21 10:08:22.069260 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069217 2573 scope.go:117] "RemoveContainer" containerID="df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a" Apr 21 10:08:22.069446 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069417 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a"} err="failed to get container status \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": rpc error: code = NotFound desc = could not find container \"df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a\": container with ID starting with df437c1048ec7e32588e7221d5b6f4bccca7017b337863fd802bd1cc852a843a not found: ID does not exist" Apr 21 10:08:22.069487 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069447 2573 scope.go:117] "RemoveContainer" containerID="0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6" Apr 21 10:08:22.069695 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.069676 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6"} err="failed to get container status \"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": rpc error: code = NotFound desc = could not find container 
\"0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6\": container with ID starting with 0cf626da2e122e10a38a1491e49414e5833924287371903a72a2dfa3cf1aa7f6 not found: ID does not exist" Apr 21 10:08:22.104711 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104685 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-metrics-client-certs\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104711 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104709 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104720 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-web-config\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104729 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104739 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config-out\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104749 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-kube-rbac-proxy\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104760 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104769 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-grpc-tls\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104779 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104788 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-config\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104796 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104807 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104816 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-configmap-metrics-client-ca\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104824 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-tls-assets\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104832 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104840 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-db\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104849 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.104854 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.104857 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbrdv\" 
(UniqueName: \"kubernetes.io/projected/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0-kube-api-access-sbrdv\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:08:22.333637 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.333603 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:08:22.341265 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.341227 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:08:22.368855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.368820 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:08:22.369249 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369234 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="init-config-reloader" Apr 21 10:08:22.369288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369253 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="init-config-reloader" Apr 21 10:08:22.369288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369270 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-thanos" Apr 21 10:08:22.369288 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369279 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-thanos" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369293 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="config-reloader" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369303 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="config-reloader" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369315 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-web" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369323 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-web" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369337 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369345 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369356 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="thanos-sidecar" Apr 21 10:08:22.369374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369365 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="thanos-sidecar" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369381 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="prometheus" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369391 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="prometheus" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369454 2573 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369465 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="prometheus" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369476 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-thanos" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369488 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="thanos-sidecar" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369497 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="kube-rbac-proxy-web" Apr 21 10:08:22.369619 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.369507 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" containerName="config-reloader" Apr 21 10:08:22.373771 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.373754 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.376672 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.376649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 10:08:22.376773 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.376688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 10:08:22.376773 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.376741 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 10:08:22.376892 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.376877 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4656c7cvjvu8f\"" Apr 21 10:08:22.377013 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.376997 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 10:08:22.377086 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.377071 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 10:08:22.377134 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.377107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 10:08:22.377180 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.377129 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 10:08:22.377588 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.377573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 10:08:22.377682 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.377666 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qdzph\"" Apr 21 10:08:22.377731 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.377719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 10:08:22.378029 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.378005 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 10:08:22.379618 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.379599 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 10:08:22.382470 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.382453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 10:08:22.388920 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.388890 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:08:22.508224 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508224 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508422 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92ac8451-a21e-41d5-bd03-d1daebae4cdf-config-out\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508422 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508422 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508422 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508422 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508377 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-web-config\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508437 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508503 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbmdw\" (UniqueName: \"kubernetes.io/projected/92ac8451-a21e-41d5-bd03-d1daebae4cdf-kube-api-access-zbmdw\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/92ac8451-a21e-41d5-bd03-d1daebae4cdf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-config\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508603 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508796 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508612 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508796 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508644 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508796 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508662 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.508796 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.508678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.609685 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92ac8451-a21e-41d5-bd03-d1daebae4cdf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.609685 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-config\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
21 10:08:22.609685 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.609685 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92ac8451-a21e-41d5-bd03-d1daebae4cdf-config-out\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610000 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.609965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.610009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.610038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-web-config\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.610069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.610106 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.610420 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.610150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbmdw\" (UniqueName: \"kubernetes.io/projected/92ac8451-a21e-41d5-bd03-d1daebae4cdf-kube-api-access-zbmdw\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.611142 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.611013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.614759 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.613816 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-config\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.614759 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.614153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.614759 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.614413 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.614759 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.614471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92ac8451-a21e-41d5-bd03-d1daebae4cdf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.614759 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.614568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92ac8451-a21e-41d5-bd03-d1daebae4cdf-config-out\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.615116 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.614890 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.615173 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.615130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.615749 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.615439 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.616058 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.615920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.616058 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.616022 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.616850 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.616807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ac8451-a21e-41d5-bd03-d1daebae4cdf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.617171 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.617101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.618113 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.617973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.618215 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.618195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.619051 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.619019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.619293 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.619271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbmdw\" (UniqueName: \"kubernetes.io/projected/92ac8451-a21e-41d5-bd03-d1daebae4cdf-kube-api-access-zbmdw\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.621740 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.621679 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" 
podUID="1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0" Apr 21 10:08:22.621740 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:22.621681 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bqc7r" podUID="2a4340c5-5a53-4cd3-b487-d469b4bb82c5" Apr 21 10:08:22.621882 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.621799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92ac8451-a21e-41d5-bd03-d1daebae4cdf-web-config\") pod \"prometheus-k8s-0\" (UID: \"92ac8451-a21e-41d5-bd03-d1daebae4cdf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.684705 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.684669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:22.828291 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:22.828080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:08:22.829182 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:08:22.829149 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ac8451_a21e_41d5_bd03_d1daebae4cdf.slice/crio-ee14ffe85f699580728f4091e303964d7a18d2cb00bd786224f27f23a09090dd WatchSource:0}: Error finding container ee14ffe85f699580728f4091e303964d7a18d2cb00bd786224f27f23a09090dd: Status 404 returned error can't find the container with id ee14ffe85f699580728f4091e303964d7a18d2cb00bd786224f27f23a09090dd Apr 21 10:08:23.022755 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:23.022724 2573 generic.go:358] "Generic (PLEG): container finished" podID="92ac8451-a21e-41d5-bd03-d1daebae4cdf" containerID="9e70bf42b4156d04dde0e64c5b7e9dda1584946f779f9623b5f1389f0f003ace" exitCode=0 Apr 21 
10:08:23.023220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:23.022816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerDied","Data":"9e70bf42b4156d04dde0e64c5b7e9dda1584946f779f9623b5f1389f0f003ace"} Apr 21 10:08:23.023220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:23.022839 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bqc7r" Apr 21 10:08:23.023220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:23.022857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"ee14ffe85f699580728f4091e303964d7a18d2cb00bd786224f27f23a09090dd"} Apr 21 10:08:23.023220 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:23.022978 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:08:23.624945 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:08:23.624911 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cd55l" podUID="dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1" Apr 21 10:08:24.027800 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.027756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" event={"ID":"294e1d79-5934-4d4d-b1c2-65706662f756","Type":"ContainerStarted","Data":"77e211f9fd3a50f907f86ec3896be271099da3ec05f60c28926860786b62dd3a"} Apr 21 10:08:24.027800 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.027801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" 
event={"ID":"294e1d79-5934-4d4d-b1c2-65706662f756","Type":"ContainerStarted","Data":"e84eda9bb6882c9052fe62d4f2be8f520eba960c39952487163901a90e9866fb"} Apr 21 10:08:24.028263 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.027816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" event={"ID":"294e1d79-5934-4d4d-b1c2-65706662f756","Type":"ContainerStarted","Data":"e9970073a8853975ac6247c49201a51c3b8305fecbeaba93189a856fd55edb0b"} Apr 21 10:08:24.030584 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:08:24.030681 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"9b34dd6d7a24b53d24e35d1f62afb1054c1e97dc9765c375dc1b028a9d5a74fb"} Apr 21 10:08:24.030681 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"bc76db9fdf120308e7d3f74b716b577105de2e8ee7387dc4ce5bdd9844da6a7f"} Apr 21 10:08:24.030681 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030620 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"86f9bf2d030ab02739b226aef851b10993ca5ca13e1106d25d9923d1e0e50843"} Apr 21 10:08:24.030681 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"2f6597bd705faf093523a89526c3e6480f91810888bb5c3e66b49287286a409c"} Apr 21 10:08:24.030681 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"54c894c2a76ec40046ca2d5b3de4c4bb67c0083576c9d544b535c7d1b8fd0a5b"} Apr 21 10:08:24.030681 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.030651 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92ac8451-a21e-41d5-bd03-d1daebae4cdf","Type":"ContainerStarted","Data":"de9be73977ae59e40af1b4db456c4b7910fae19d18de0ad8b3f1f4930dd4e0bc"} Apr 21 10:08:24.058649 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.058593 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5b4b858b84-8kgml" podStartSLOduration=1.1279952 podStartE2EDuration="3.058574157s" podCreationTimestamp="2026-04-21 10:08:21 +0000 UTC" firstStartedPulling="2026-04-21 10:08:21.819745922 +0000 UTC m=+280.190546609" lastFinishedPulling="2026-04-21 10:08:23.750324879 +0000 UTC m=+282.121125566" observedRunningTime="2026-04-21 10:08:24.050457354 +0000 UTC m=+282.421258063" watchObservedRunningTime="2026-04-21 10:08:24.058574157 +0000 UTC m=+282.429374866" Apr 21 10:08:24.087823 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.087769 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.087755377 podStartE2EDuration="2.087755377s" podCreationTimestamp="2026-04-21 10:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:08:24.08348454 +0000 UTC m=+282.454285248" 
watchObservedRunningTime="2026-04-21 10:08:24.087755377 +0000 UTC m=+282.458556087" Apr 21 10:08:24.191025 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:24.190944 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0" path="/var/lib/kubelet/pods/1a1374f6-cd56-4f0d-bff6-4f9d32a4b1f0/volumes" Apr 21 10:08:26.548846 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.548791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:08:26.551185 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.551150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9gkbz\" (UID: \"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:08:26.627142 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.627109 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-llcxp\"" Apr 21 10:08:26.635210 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.635188 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" Apr 21 10:08:26.649483 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.649454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:08:26.649655 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.649510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:08:26.651773 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.651748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a4340c5-5a53-4cd3-b487-d469b4bb82c5-metrics-tls\") pod \"dns-default-bqc7r\" (UID: \"2a4340c5-5a53-4cd3-b487-d469b4bb82c5\") " pod="openshift-dns/dns-default-bqc7r" Apr 21 10:08:26.651948 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.651920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1-cert\") pod \"ingress-canary-cd55l\" (UID: \"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1\") " pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:08:26.734666 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.734638 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2fp54\"" Apr 21 10:08:26.741950 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.741924 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cd55l" Apr 21 10:08:26.755899 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.755873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz"] Apr 21 10:08:26.757999 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:08:26.757962 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e5525d0_e4ac_4ad3_8e4a_af53759f4ae0.slice/crio-8b1c74b69cde8fccb56096d6fcd76db3bd602594875f4a446f89942606f2d949 WatchSource:0}: Error finding container 8b1c74b69cde8fccb56096d6fcd76db3bd602594875f4a446f89942606f2d949: Status 404 returned error can't find the container with id 8b1c74b69cde8fccb56096d6fcd76db3bd602594875f4a446f89942606f2d949 Apr 21 10:08:26.876842 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.876815 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cd55l"] Apr 21 10:08:26.879433 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:08:26.879404 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf4e3a0_23bd_4b3c_a708_c10c960ca6c1.slice/crio-232980d9eee92f0cd82fa4d78b93f8826b7692cb0488f2b7438270736623b848 WatchSource:0}: Error finding container 232980d9eee92f0cd82fa4d78b93f8826b7692cb0488f2b7438270736623b848: Status 404 returned error can't find the container with id 232980d9eee92f0cd82fa4d78b93f8826b7692cb0488f2b7438270736623b848 Apr 21 10:08:26.926736 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.926707 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nqzpr\"" Apr 21 10:08:26.934881 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:26.934859 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bqc7r" Apr 21 10:08:27.040491 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:27.040454 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cd55l" event={"ID":"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1","Type":"ContainerStarted","Data":"232980d9eee92f0cd82fa4d78b93f8826b7692cb0488f2b7438270736623b848"} Apr 21 10:08:27.041494 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:27.041473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" event={"ID":"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0","Type":"ContainerStarted","Data":"8b1c74b69cde8fccb56096d6fcd76db3bd602594875f4a446f89942606f2d949"} Apr 21 10:08:27.052631 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:27.052490 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bqc7r"] Apr 21 10:08:27.055089 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:08:27.055063 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4340c5_5a53_4cd3_b487_d469b4bb82c5.slice/crio-9e9514e4d5f8a7ffb43fce44c84f0070f1c8cf4a06b531c6be0a319096d71366 WatchSource:0}: Error finding container 9e9514e4d5f8a7ffb43fce44c84f0070f1c8cf4a06b531c6be0a319096d71366: Status 404 returned error can't find the container with id 9e9514e4d5f8a7ffb43fce44c84f0070f1c8cf4a06b531c6be0a319096d71366 Apr 21 10:08:27.685254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:27.685195 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:28.046789 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:28.046745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" 
event={"ID":"1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0","Type":"ContainerStarted","Data":"8f26dceedbac935146bb88825583de73caab10a629e837991ab3b925d5976155"} Apr 21 10:08:28.047959 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:28.047935 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bqc7r" event={"ID":"2a4340c5-5a53-4cd3-b487-d469b4bb82c5","Type":"ContainerStarted","Data":"9e9514e4d5f8a7ffb43fce44c84f0070f1c8cf4a06b531c6be0a319096d71366"} Apr 21 10:08:28.063195 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:28.063060 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9gkbz" podStartSLOduration=283.992980958 podStartE2EDuration="4m45.063042488s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="2026-04-21 10:08:26.762023597 +0000 UTC m=+285.132824284" lastFinishedPulling="2026-04-21 10:08:27.832085115 +0000 UTC m=+286.202885814" observedRunningTime="2026-04-21 10:08:28.062471355 +0000 UTC m=+286.433272065" watchObservedRunningTime="2026-04-21 10:08:28.063042488 +0000 UTC m=+286.433843201" Apr 21 10:08:29.052866 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:29.052760 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cd55l" event={"ID":"dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1","Type":"ContainerStarted","Data":"5e32380e54c3376c2289f435880e47aaa382eeb04e789f7695125b092b0b1d79"} Apr 21 10:08:29.055306 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:29.055275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bqc7r" event={"ID":"2a4340c5-5a53-4cd3-b487-d469b4bb82c5","Type":"ContainerStarted","Data":"42b1d5f15a07919490a4f5872361bd71fba5a6d3e22822c796c0430f8306a4da"} Apr 21 10:08:30.060046 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:30.060008 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bqc7r" 
event={"ID":"2a4340c5-5a53-4cd3-b487-d469b4bb82c5","Type":"ContainerStarted","Data":"fed71e9ecfa3c49617343f8394605bac2b55bbd5e286dfd2f4a47983bea253ff"} Apr 21 10:08:30.081173 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:30.081124 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bqc7r" podStartSLOduration=252.266676909 podStartE2EDuration="4m14.081110171s" podCreationTimestamp="2026-04-21 10:04:16 +0000 UTC" firstStartedPulling="2026-04-21 10:08:27.056746514 +0000 UTC m=+285.427547200" lastFinishedPulling="2026-04-21 10:08:28.871179773 +0000 UTC m=+287.241980462" observedRunningTime="2026-04-21 10:08:30.079273162 +0000 UTC m=+288.450073871" watchObservedRunningTime="2026-04-21 10:08:30.081110171 +0000 UTC m=+288.451910919" Apr 21 10:08:30.081394 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:30.081374 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cd55l" podStartSLOduration=252.087425107 podStartE2EDuration="4m14.081368766s" podCreationTimestamp="2026-04-21 10:04:16 +0000 UTC" firstStartedPulling="2026-04-21 10:08:26.881260229 +0000 UTC m=+285.252060916" lastFinishedPulling="2026-04-21 10:08:28.875203888 +0000 UTC m=+287.246004575" observedRunningTime="2026-04-21 10:08:29.071133742 +0000 UTC m=+287.441934464" watchObservedRunningTime="2026-04-21 10:08:30.081368766 +0000 UTC m=+288.452169474" Apr 21 10:08:31.064345 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:31.064312 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bqc7r" Apr 21 10:08:38.069470 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:38.069439 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bqc7r" Apr 21 10:08:42.146985 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:42.146957 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:08:42.147471 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:42.147449 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:08:42.153637 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:08:42.153616 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 10:09:22.684901 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:09:22.684864 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:09:22.700636 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:09:22.700607 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:09:23.227973 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:09:23.227946 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:12:27.250801 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.250711 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-mdh49"] Apr 21 10:12:27.254070 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.254040 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:27.256808 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.256758 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 21 10:12:27.257754 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.257731 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 10:12:27.257863 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.257732 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 10:12:27.257863 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.257732 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-x97xp\"" Apr 21 10:12:27.263343 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.263315 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-mdh49"] Apr 21 10:12:27.322641 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.322605 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79736023-3e01-47ad-bef8-701841fb35ab-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:27.322824 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.322657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqdk\" (UniqueName: \"kubernetes.io/projected/79736023-3e01-47ad-bef8-701841fb35ab-kube-api-access-vxqdk\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" 
Apr 21 10:12:27.423292 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.423248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79736023-3e01-47ad-bef8-701841fb35ab-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:27.423292 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.423300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxqdk\" (UniqueName: \"kubernetes.io/projected/79736023-3e01-47ad-bef8-701841fb35ab-kube-api-access-vxqdk\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:27.423596 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:12:27.423409 2573 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 21 10:12:27.423596 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:12:27.423500 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79736023-3e01-47ad-bef8-701841fb35ab-cert podName:79736023-3e01-47ad-bef8-701841fb35ab nodeName:}" failed. No retries permitted until 2026-04-21 10:12:27.923478315 +0000 UTC m=+526.294279006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79736023-3e01-47ad-bef8-701841fb35ab-cert") pod "llmisvc-controller-manager-68cc5db7c4-mdh49" (UID: "79736023-3e01-47ad-bef8-701841fb35ab") : secret "llmisvc-webhook-server-cert" not found Apr 21 10:12:27.441700 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.441667 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqdk\" (UniqueName: \"kubernetes.io/projected/79736023-3e01-47ad-bef8-701841fb35ab-kube-api-access-vxqdk\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:27.926786 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.926752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79736023-3e01-47ad-bef8-701841fb35ab-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:27.929103 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:27.929078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79736023-3e01-47ad-bef8-701841fb35ab-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mdh49\" (UID: \"79736023-3e01-47ad-bef8-701841fb35ab\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:28.166628 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:28.166590 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:28.290426 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:28.290391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-mdh49"] Apr 21 10:12:28.293437 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:12:28.293407 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod79736023_3e01_47ad_bef8_701841fb35ab.slice/crio-97336eeeae859415e48e850913d0c29a91b1f775365d06852ee8f6238b639a3d WatchSource:0}: Error finding container 97336eeeae859415e48e850913d0c29a91b1f775365d06852ee8f6238b639a3d: Status 404 returned error can't find the container with id 97336eeeae859415e48e850913d0c29a91b1f775365d06852ee8f6238b639a3d Apr 21 10:12:28.294737 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:28.294717 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:12:28.745022 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:28.744985 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" event={"ID":"79736023-3e01-47ad-bef8-701841fb35ab","Type":"ContainerStarted","Data":"97336eeeae859415e48e850913d0c29a91b1f775365d06852ee8f6238b639a3d"} Apr 21 10:12:30.752393 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:30.752353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" event={"ID":"79736023-3e01-47ad-bef8-701841fb35ab","Type":"ContainerStarted","Data":"abd7e05745ef68c831abf8ec16838668e7b9db40e01915732ec2f2d1bd31e46d"} Apr 21 10:12:30.752815 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:30.752509 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:12:30.768811 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:12:30.768766 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" podStartSLOduration=1.825374832 podStartE2EDuration="3.768751364s" podCreationTimestamp="2026-04-21 10:12:27 +0000 UTC" firstStartedPulling="2026-04-21 10:12:28.294837346 +0000 UTC m=+526.665638032" lastFinishedPulling="2026-04-21 10:12:30.238213869 +0000 UTC m=+528.609014564" observedRunningTime="2026-04-21 10:12:30.768030269 +0000 UTC m=+529.138830980" watchObservedRunningTime="2026-04-21 10:12:30.768751364 +0000 UTC m=+529.139552073" Apr 21 10:13:01.758069 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:01.758038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mdh49" Apr 21 10:13:42.171110 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:42.171082 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:13:42.171708 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:42.171460 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:13:52.799865 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:52.799826 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-k2gbb"] Apr 21 10:13:52.802697 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:52.802678 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-k2gbb" Apr 21 10:13:52.805349 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:52.805321 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 21 10:13:52.805453 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:52.805323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mv2mf\"" Apr 21 10:13:52.809757 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:52.809473 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-k2gbb"] Apr 21 10:13:52.976021 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:52.975970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwtx\" (UniqueName: \"kubernetes.io/projected/88ae03a6-c387-4295-8629-d80af06c998b-kube-api-access-fvwtx\") pod \"s3-init-k2gbb\" (UID: \"88ae03a6-c387-4295-8629-d80af06c998b\") " pod="kserve/s3-init-k2gbb" Apr 21 10:13:53.077366 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:53.077274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwtx\" (UniqueName: \"kubernetes.io/projected/88ae03a6-c387-4295-8629-d80af06c998b-kube-api-access-fvwtx\") pod \"s3-init-k2gbb\" (UID: \"88ae03a6-c387-4295-8629-d80af06c998b\") " pod="kserve/s3-init-k2gbb" Apr 21 10:13:53.086480 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:53.086451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwtx\" (UniqueName: \"kubernetes.io/projected/88ae03a6-c387-4295-8629-d80af06c998b-kube-api-access-fvwtx\") pod \"s3-init-k2gbb\" (UID: \"88ae03a6-c387-4295-8629-d80af06c998b\") " pod="kserve/s3-init-k2gbb" Apr 21 10:13:53.127409 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:53.127369 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-k2gbb" Apr 21 10:13:53.250078 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:53.250037 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-k2gbb"] Apr 21 10:13:53.253271 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:13:53.253242 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ae03a6_c387_4295_8629_d80af06c998b.slice/crio-b6de8fe29f205d7b1e66ed428dbedef7d99d75c61340b33c1019d37a98e2aefb WatchSource:0}: Error finding container b6de8fe29f205d7b1e66ed428dbedef7d99d75c61340b33c1019d37a98e2aefb: Status 404 returned error can't find the container with id b6de8fe29f205d7b1e66ed428dbedef7d99d75c61340b33c1019d37a98e2aefb Apr 21 10:13:54.000512 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:54.000472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-k2gbb" event={"ID":"88ae03a6-c387-4295-8629-d80af06c998b","Type":"ContainerStarted","Data":"b6de8fe29f205d7b1e66ed428dbedef7d99d75c61340b33c1019d37a98e2aefb"} Apr 21 10:13:58.015305 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:58.015202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-k2gbb" event={"ID":"88ae03a6-c387-4295-8629-d80af06c998b","Type":"ContainerStarted","Data":"b2c37593c88a287e8fa02444b6fbb7ab9147cf93929444f65e3a12c533d9f292"} Apr 21 10:13:58.031018 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:13:58.030956 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-k2gbb" podStartSLOduration=1.673400026 podStartE2EDuration="6.030938738s" podCreationTimestamp="2026-04-21 10:13:52 +0000 UTC" firstStartedPulling="2026-04-21 10:13:53.255375373 +0000 UTC m=+611.626176061" lastFinishedPulling="2026-04-21 10:13:57.612914082 +0000 UTC m=+615.983714773" observedRunningTime="2026-04-21 10:13:58.030123386 +0000 UTC m=+616.400924097" watchObservedRunningTime="2026-04-21 
10:13:58.030938738 +0000 UTC m=+616.401739450" Apr 21 10:14:01.026950 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:01.026859 2573 generic.go:358] "Generic (PLEG): container finished" podID="88ae03a6-c387-4295-8629-d80af06c998b" containerID="b2c37593c88a287e8fa02444b6fbb7ab9147cf93929444f65e3a12c533d9f292" exitCode=0 Apr 21 10:14:01.026950 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:01.026908 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-k2gbb" event={"ID":"88ae03a6-c387-4295-8629-d80af06c998b","Type":"ContainerDied","Data":"b2c37593c88a287e8fa02444b6fbb7ab9147cf93929444f65e3a12c533d9f292"} Apr 21 10:14:02.157644 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:02.157617 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-k2gbb" Apr 21 10:14:02.260978 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:02.260940 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvwtx\" (UniqueName: \"kubernetes.io/projected/88ae03a6-c387-4295-8629-d80af06c998b-kube-api-access-fvwtx\") pod \"88ae03a6-c387-4295-8629-d80af06c998b\" (UID: \"88ae03a6-c387-4295-8629-d80af06c998b\") " Apr 21 10:14:02.263145 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:02.263121 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ae03a6-c387-4295-8629-d80af06c998b-kube-api-access-fvwtx" (OuterVolumeSpecName: "kube-api-access-fvwtx") pod "88ae03a6-c387-4295-8629-d80af06c998b" (UID: "88ae03a6-c387-4295-8629-d80af06c998b"). InnerVolumeSpecName "kube-api-access-fvwtx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:14:02.362476 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:02.362441 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvwtx\" (UniqueName: \"kubernetes.io/projected/88ae03a6-c387-4295-8629-d80af06c998b-kube-api-access-fvwtx\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:14:03.033805 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.033772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-k2gbb" event={"ID":"88ae03a6-c387-4295-8629-d80af06c998b","Type":"ContainerDied","Data":"b6de8fe29f205d7b1e66ed428dbedef7d99d75c61340b33c1019d37a98e2aefb"} Apr 21 10:14:03.033805 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.033807 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6de8fe29f205d7b1e66ed428dbedef7d99d75c61340b33c1019d37a98e2aefb" Apr 21 10:14:03.034029 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.033781 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-k2gbb" Apr 21 10:14:03.738061 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.738029 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts"] Apr 21 10:14:03.738430 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.738415 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88ae03a6-c387-4295-8629-d80af06c998b" containerName="s3-init" Apr 21 10:14:03.738473 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.738432 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ae03a6-c387-4295-8629-d80af06c998b" containerName="s3-init" Apr 21 10:14:03.738506 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.738493 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="88ae03a6-c387-4295-8629-d80af06c998b" containerName="s3-init" Apr 21 10:14:03.741373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.741355 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:03.743985 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.743961 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 21 10:14:03.744094 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.744059 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mv2mf\"" Apr 21 10:14:03.748599 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.748575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts"] Apr 21 10:14:03.773833 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.773799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbrg\" (UniqueName: \"kubernetes.io/projected/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-kube-api-access-mmbrg\") pod \"seaweedfs-tls-custom-ddd4dbfd-rvhts\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:03.773833 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.773843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-rvhts\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:03.874855 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.874814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbrg\" (UniqueName: \"kubernetes.io/projected/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-kube-api-access-mmbrg\") pod \"seaweedfs-tls-custom-ddd4dbfd-rvhts\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:03.874855 
ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.874862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-rvhts\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:03.875200 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.875184 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-rvhts\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:03.883206 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:03.883174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbrg\" (UniqueName: \"kubernetes.io/projected/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-kube-api-access-mmbrg\") pod \"seaweedfs-tls-custom-ddd4dbfd-rvhts\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:04.052085 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:04.052060 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:04.171498 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:04.171363 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts"] Apr 21 10:14:04.174383 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:14:04.174353 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7e3163_f7b1_4cdb_a2b2_b2e343572f91.slice/crio-70f14e6a098aebcce10279740b773e23e59f02a9b2df9fe6b8bba622b7b92a46 WatchSource:0}: Error finding container 70f14e6a098aebcce10279740b773e23e59f02a9b2df9fe6b8bba622b7b92a46: Status 404 returned error can't find the container with id 70f14e6a098aebcce10279740b773e23e59f02a9b2df9fe6b8bba622b7b92a46 Apr 21 10:14:05.040987 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:05.040949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" event={"ID":"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91","Type":"ContainerStarted","Data":"70f14e6a098aebcce10279740b773e23e59f02a9b2df9fe6b8bba622b7b92a46"} Apr 21 10:14:08.050777 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:08.050741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" event={"ID":"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91","Type":"ContainerStarted","Data":"fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410"} Apr 21 10:14:08.069421 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:08.069372 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" podStartSLOduration=1.6910850960000001 podStartE2EDuration="5.069352279s" podCreationTimestamp="2026-04-21 10:14:03 +0000 UTC" firstStartedPulling="2026-04-21 10:14:04.175507062 +0000 UTC m=+622.546307750" lastFinishedPulling="2026-04-21 10:14:07.553774235 +0000 UTC m=+625.924574933" 
observedRunningTime="2026-04-21 10:14:08.067726856 +0000 UTC m=+626.438527577" watchObservedRunningTime="2026-04-21 10:14:08.069352279 +0000 UTC m=+626.440152989" Apr 21 10:14:08.842934 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:08.842896 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts"] Apr 21 10:14:10.056656 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:10.056617 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" podUID="ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" containerName="seaweedfs-tls-custom" containerID="cri-o://fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410" gracePeriod=30 Apr 21 10:14:38.291390 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.291366 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:38.489883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.489790 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-data\") pod \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " Apr 21 10:14:38.489883 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.489864 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmbrg\" (UniqueName: \"kubernetes.io/projected/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-kube-api-access-mmbrg\") pod \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\" (UID: \"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91\") " Apr 21 10:14:38.491035 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.491011 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-data" (OuterVolumeSpecName: "data") pod "ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" (UID: 
"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:14:38.491940 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.491921 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-kube-api-access-mmbrg" (OuterVolumeSpecName: "kube-api-access-mmbrg") pod "ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" (UID: "ba7e3163-f7b1-4cdb-a2b2-b2e343572f91"). InnerVolumeSpecName "kube-api-access-mmbrg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:14:38.590714 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.590675 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmbrg\" (UniqueName: \"kubernetes.io/projected/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-kube-api-access-mmbrg\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:14:38.590714 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:38.590706 2573 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91-data\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:14:39.145003 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.144965 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" containerID="fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410" exitCode=0 Apr 21 10:14:39.145169 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.145053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" event={"ID":"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91","Type":"ContainerDied","Data":"fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410"} Apr 21 10:14:39.145169 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.145088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" event={"ID":"ba7e3163-f7b1-4cdb-a2b2-b2e343572f91","Type":"ContainerDied","Data":"70f14e6a098aebcce10279740b773e23e59f02a9b2df9fe6b8bba622b7b92a46"} Apr 21 10:14:39.145169 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.145104 2573 scope.go:117] "RemoveContainer" containerID="fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410" Apr 21 10:14:39.145169 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.145064 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts" Apr 21 10:14:39.154459 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.154438 2573 scope.go:117] "RemoveContainer" containerID="fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410" Apr 21 10:14:39.154740 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:14:39.154716 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410\": container with ID starting with fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410 not found: ID does not exist" containerID="fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410" Apr 21 10:14:39.154794 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.154752 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410"} err="failed to get container status \"fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410\": rpc error: code = NotFound desc = could not find container \"fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410\": container with ID starting with fd0f52400d84e46b902275a490fe4f7e237b546f064b8642b2aabd8c7d965410 not found: ID does not exist" Apr 21 10:14:39.165277 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.165255 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts"] Apr 21 10:14:39.168527 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.168508 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-rvhts"] Apr 21 10:14:39.195713 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.195688 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-clx99"] Apr 21 10:14:39.196021 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.196010 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" containerName="seaweedfs-tls-custom" Apr 21 10:14:39.196063 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.196023 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" containerName="seaweedfs-tls-custom" Apr 21 10:14:39.196100 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.196087 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" containerName="seaweedfs-tls-custom" Apr 21 10:14:39.200563 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.200523 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.202957 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.202941 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 21 10:14:39.203157 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.203144 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mv2mf\"" Apr 21 10:14:39.203388 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.203340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 21 10:14:39.205887 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.205867 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-clx99"] Apr 21 10:14:39.294982 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.294945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c5fcab5-5912-4c82-832f-5da7d4b6460b-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.295412 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.295020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/0c5fcab5-5912-4c82-832f-5da7d4b6460b-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.295412 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.295065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4kjw\" (UniqueName: 
\"kubernetes.io/projected/0c5fcab5-5912-4c82-832f-5da7d4b6460b-kube-api-access-k4kjw\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.395606 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.395506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c5fcab5-5912-4c82-832f-5da7d4b6460b-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.395606 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.395595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/0c5fcab5-5912-4c82-832f-5da7d4b6460b-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.395790 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.395617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4kjw\" (UniqueName: \"kubernetes.io/projected/0c5fcab5-5912-4c82-832f-5da7d4b6460b-kube-api-access-k4kjw\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.395879 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.395859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c5fcab5-5912-4c82-832f-5da7d4b6460b-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.397951 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.397923 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/0c5fcab5-5912-4c82-832f-5da7d4b6460b-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.405789 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.405768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4kjw\" (UniqueName: \"kubernetes.io/projected/0c5fcab5-5912-4c82-832f-5da7d4b6460b-kube-api-access-k4kjw\") pod \"seaweedfs-tls-custom-5c88b85bb7-clx99\" (UID: \"0c5fcab5-5912-4c82-832f-5da7d4b6460b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.511500 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.511459 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" Apr 21 10:14:39.629984 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:39.629956 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-clx99"] Apr 21 10:14:39.632451 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:14:39.632422 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5fcab5_5912_4c82_832f_5da7d4b6460b.slice/crio-d42d392e38de25e7c3a6a5dea72a46cf4a1334991de7cc7aece66b2ff3e147df WatchSource:0}: Error finding container d42d392e38de25e7c3a6a5dea72a46cf4a1334991de7cc7aece66b2ff3e147df: Status 404 returned error can't find the container with id d42d392e38de25e7c3a6a5dea72a46cf4a1334991de7cc7aece66b2ff3e147df Apr 21 10:14:40.150438 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.150351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" 
event={"ID":"0c5fcab5-5912-4c82-832f-5da7d4b6460b","Type":"ContainerStarted","Data":"bd58bbbeb0cc716a44eaa693e8ca4dd6fea946963164abcbc52e802a9699e2aa"} Apr 21 10:14:40.150438 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.150384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" event={"ID":"0c5fcab5-5912-4c82-832f-5da7d4b6460b","Type":"ContainerStarted","Data":"d42d392e38de25e7c3a6a5dea72a46cf4a1334991de7cc7aece66b2ff3e147df"} Apr 21 10:14:40.166707 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.166658 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-clx99" podStartSLOduration=0.910357734 podStartE2EDuration="1.166642403s" podCreationTimestamp="2026-04-21 10:14:39 +0000 UTC" firstStartedPulling="2026-04-21 10:14:39.633646936 +0000 UTC m=+658.004447623" lastFinishedPulling="2026-04-21 10:14:39.889931605 +0000 UTC m=+658.260732292" observedRunningTime="2026-04-21 10:14:40.165052732 +0000 UTC m=+658.535853447" watchObservedRunningTime="2026-04-21 10:14:40.166642403 +0000 UTC m=+658.537443112" Apr 21 10:14:40.192124 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.192093 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7e3163-f7b1-4cdb-a2b2-b2e343572f91" path="/var/lib/kubelet/pods/ba7e3163-f7b1-4cdb-a2b2-b2e343572f91/volumes" Apr 21 10:14:40.467479 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.467397 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-w88v9"] Apr 21 10:14:40.470789 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.470770 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:40.477059 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.477031 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-w88v9"] Apr 21 10:14:40.503779 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.503739 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz6jx\" (UniqueName: \"kubernetes.io/projected/b528c15b-ffd2-48b5-9204-df01b43e33c4-kube-api-access-vz6jx\") pod \"s3-tls-init-custom-w88v9\" (UID: \"b528c15b-ffd2-48b5-9204-df01b43e33c4\") " pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:40.604856 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.604824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz6jx\" (UniqueName: \"kubernetes.io/projected/b528c15b-ffd2-48b5-9204-df01b43e33c4-kube-api-access-vz6jx\") pod \"s3-tls-init-custom-w88v9\" (UID: \"b528c15b-ffd2-48b5-9204-df01b43e33c4\") " pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:40.613265 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.613229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz6jx\" (UniqueName: \"kubernetes.io/projected/b528c15b-ffd2-48b5-9204-df01b43e33c4-kube-api-access-vz6jx\") pod \"s3-tls-init-custom-w88v9\" (UID: \"b528c15b-ffd2-48b5-9204-df01b43e33c4\") " pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:40.791058 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.791024 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:40.909499 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:40.909472 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-w88v9"] Apr 21 10:14:40.911466 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:14:40.911441 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb528c15b_ffd2_48b5_9204_df01b43e33c4.slice/crio-2bc957f8466564a42b04156be74a4cfe301b56ed7271c1f98fb396162afbcdcb WatchSource:0}: Error finding container 2bc957f8466564a42b04156be74a4cfe301b56ed7271c1f98fb396162afbcdcb: Status 404 returned error can't find the container with id 2bc957f8466564a42b04156be74a4cfe301b56ed7271c1f98fb396162afbcdcb Apr 21 10:14:41.155171 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:41.155064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w88v9" event={"ID":"b528c15b-ffd2-48b5-9204-df01b43e33c4","Type":"ContainerStarted","Data":"db25974669073e9febfe03919cc97b6409411b99d12550bba7973596e42c9d3a"} Apr 21 10:14:41.155171 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:41.155106 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w88v9" event={"ID":"b528c15b-ffd2-48b5-9204-df01b43e33c4","Type":"ContainerStarted","Data":"2bc957f8466564a42b04156be74a4cfe301b56ed7271c1f98fb396162afbcdcb"} Apr 21 10:14:41.170236 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:41.170170 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-w88v9" podStartSLOduration=1.170149284 podStartE2EDuration="1.170149284s" podCreationTimestamp="2026-04-21 10:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:14:41.168968761 +0000 UTC m=+659.539769471" watchObservedRunningTime="2026-04-21 
10:14:41.170149284 +0000 UTC m=+659.540949994" Apr 21 10:14:46.170468 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:46.170432 2573 generic.go:358] "Generic (PLEG): container finished" podID="b528c15b-ffd2-48b5-9204-df01b43e33c4" containerID="db25974669073e9febfe03919cc97b6409411b99d12550bba7973596e42c9d3a" exitCode=0 Apr 21 10:14:46.170878 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:46.170503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w88v9" event={"ID":"b528c15b-ffd2-48b5-9204-df01b43e33c4","Type":"ContainerDied","Data":"db25974669073e9febfe03919cc97b6409411b99d12550bba7973596e42c9d3a"} Apr 21 10:14:47.307518 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:47.307489 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:47.365650 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:47.365609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz6jx\" (UniqueName: \"kubernetes.io/projected/b528c15b-ffd2-48b5-9204-df01b43e33c4-kube-api-access-vz6jx\") pod \"b528c15b-ffd2-48b5-9204-df01b43e33c4\" (UID: \"b528c15b-ffd2-48b5-9204-df01b43e33c4\") " Apr 21 10:14:47.367721 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:47.367693 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b528c15b-ffd2-48b5-9204-df01b43e33c4-kube-api-access-vz6jx" (OuterVolumeSpecName: "kube-api-access-vz6jx") pod "b528c15b-ffd2-48b5-9204-df01b43e33c4" (UID: "b528c15b-ffd2-48b5-9204-df01b43e33c4"). InnerVolumeSpecName "kube-api-access-vz6jx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:14:47.466248 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:47.466158 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vz6jx\" (UniqueName: \"kubernetes.io/projected/b528c15b-ffd2-48b5-9204-df01b43e33c4-kube-api-access-vz6jx\") on node \"ip-10-0-140-234.ec2.internal\" DevicePath \"\"" Apr 21 10:14:48.177376 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:48.177348 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-w88v9" Apr 21 10:14:48.177554 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:48.177346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-w88v9" event={"ID":"b528c15b-ffd2-48b5-9204-df01b43e33c4","Type":"ContainerDied","Data":"2bc957f8466564a42b04156be74a4cfe301b56ed7271c1f98fb396162afbcdcb"} Apr 21 10:14:48.177554 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:14:48.177456 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc957f8466564a42b04156be74a4cfe301b56ed7271c1f98fb396162afbcdcb" Apr 21 10:18:24.850289 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.850203 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l"] Apr 21 10:18:24.850807 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.850600 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b528c15b-ffd2-48b5-9204-df01b43e33c4" containerName="s3-tls-init-custom" Apr 21 10:18:24.850807 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.850617 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b528c15b-ffd2-48b5-9204-df01b43e33c4" containerName="s3-tls-init-custom" Apr 21 10:18:24.850807 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.850704 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b528c15b-ffd2-48b5-9204-df01b43e33c4" 
containerName="s3-tls-init-custom" Apr 21 10:18:24.853861 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.853837 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" Apr 21 10:18:24.856559 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.856522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pjlcm\"" Apr 21 10:18:24.861512 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.861478 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l"] Apr 21 10:18:24.888821 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:24.888796 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" Apr 21 10:18:25.024301 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:25.024266 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l"] Apr 21 10:18:25.026082 ip-10-0-140-234 kubenswrapper[2573]: W0421 10:18:25.026053 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod850e4b1c_78fc_4df4_84e6_90f549a4c760.slice/crio-15d9c87a339ec03dcb137f0bb4ebae28dd1fb3fd18eca267e6892f4250d454fa WatchSource:0}: Error finding container 15d9c87a339ec03dcb137f0bb4ebae28dd1fb3fd18eca267e6892f4250d454fa: Status 404 returned error can't find the container with id 15d9c87a339ec03dcb137f0bb4ebae28dd1fb3fd18eca267e6892f4250d454fa Apr 21 10:18:25.027918 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:25.027900 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:18:25.822447 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:25.822397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" event={"ID":"850e4b1c-78fc-4df4-84e6-90f549a4c760","Type":"ContainerStarted","Data":"15d9c87a339ec03dcb137f0bb4ebae28dd1fb3fd18eca267e6892f4250d454fa"} Apr 21 10:18:26.827130 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:26.827092 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" event={"ID":"850e4b1c-78fc-4df4-84e6-90f549a4c760","Type":"ContainerStarted","Data":"66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9"} Apr 21 10:18:26.827518 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:26.827235 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" Apr 21 10:18:26.828832 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:26.828808 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" Apr 21 10:18:26.843449 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:26.843395 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" podStartSLOduration=1.816269464 podStartE2EDuration="2.843378366s" podCreationTimestamp="2026-04-21 10:18:24 +0000 UTC" firstStartedPulling="2026-04-21 10:18:25.028062131 +0000 UTC m=+883.398862817" lastFinishedPulling="2026-04-21 10:18:26.055171032 +0000 UTC m=+884.425971719" observedRunningTime="2026-04-21 10:18:26.841322493 +0000 UTC m=+885.212123214" watchObservedRunningTime="2026-04-21 10:18:26.843378366 +0000 UTC m=+885.214179075" Apr 21 10:18:42.194258 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:18:42.194226 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:18:42.195760 ip-10-0-140-234 kubenswrapper[2573]: 
I0421 10:18:42.195739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:19:59.930222 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:19:59.930187 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-hcj9l_850e4b1c-78fc-4df4-84e6-90f549a4c760/kserve-container/0.log" Apr 21 10:20:00.194092 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:00.194005 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l"] Apr 21 10:20:00.194254 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:00.194232 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" podUID="850e4b1c-78fc-4df4-84e6-90f549a4c760" containerName="kserve-container" containerID="cri-o://66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9" gracePeriod=30 Apr 21 10:20:00.422754 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:00.422727 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" Apr 21 10:20:01.123048 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.123010 2573 generic.go:358] "Generic (PLEG): container finished" podID="850e4b1c-78fc-4df4-84e6-90f549a4c760" containerID="66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9" exitCode=2 Apr 21 10:20:01.123495 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.123068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" event={"ID":"850e4b1c-78fc-4df4-84e6-90f549a4c760","Type":"ContainerDied","Data":"66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9"} Apr 21 10:20:01.123495 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.123074 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" Apr 21 10:20:01.123495 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.123096 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l" event={"ID":"850e4b1c-78fc-4df4-84e6-90f549a4c760","Type":"ContainerDied","Data":"15d9c87a339ec03dcb137f0bb4ebae28dd1fb3fd18eca267e6892f4250d454fa"} Apr 21 10:20:01.123495 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.123113 2573 scope.go:117] "RemoveContainer" containerID="66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9" Apr 21 10:20:01.131303 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.131285 2573 scope.go:117] "RemoveContainer" containerID="66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9" Apr 21 10:20:01.131591 ip-10-0-140-234 kubenswrapper[2573]: E0421 10:20:01.131572 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9\": container with ID 
starting with 66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9 not found: ID does not exist" containerID="66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9" Apr 21 10:20:01.131644 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.131602 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9"} err="failed to get container status \"66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9\": rpc error: code = NotFound desc = could not find container \"66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9\": container with ID starting with 66630593bd9c050092bc6f6ac389ebbefcaba81199ce551e38c8c5a7b8d259a9 not found: ID does not exist" Apr 21 10:20:01.142807 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.142781 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l"] Apr 21 10:20:01.147570 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:01.146892 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-hcj9l"] Apr 21 10:20:02.191208 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:20:02.191170 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850e4b1c-78fc-4df4-84e6-90f549a4c760" path="/var/lib/kubelet/pods/850e4b1c-78fc-4df4-84e6-90f549a4c760/volumes" Apr 21 10:23:42.216373 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:23:42.216343 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:23:42.219111 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:23:42.219085 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 
10:28:42.239670 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:28:42.239636 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:28:42.242431 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:28:42.242407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:33:42.262730 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:33:42.262698 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:33:42.265727 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:33:42.265703 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:38:42.284172 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:38:42.284142 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:38:42.288913 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:38:42.288891 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:43:42.308680 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:43:42.308655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:43:42.314066 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:43:42.314048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 
10:48:42.331966 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:48:42.331886 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:48:42.338014 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:48:42.337990 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:53:42.360154 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:53:42.360123 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:53:42.366134 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:53:42.366114 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:58:42.383582 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:58:42.383553 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 10:58:42.389374 ip-10-0-140-234 kubenswrapper[2573]: I0421 10:58:42.389350 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 11:03:42.413156 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:03:42.413050 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 11:03:42.419970 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:03:42.417300 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 
11:08:42.435940 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:08:42.435832 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 11:08:42.440731 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:08:42.440706 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log" Apr 21 11:10:20.416161 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.416129 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x9mhh/must-gather-9kj2m"] Apr 21 11:10:20.416886 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.416513 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="850e4b1c-78fc-4df4-84e6-90f549a4c760" containerName="kserve-container" Apr 21 11:10:20.416886 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.416528 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="850e4b1c-78fc-4df4-84e6-90f549a4c760" containerName="kserve-container" Apr 21 11:10:20.416886 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.416613 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="850e4b1c-78fc-4df4-84e6-90f549a4c760" containerName="kserve-container" Apr 21 11:10:20.419809 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.419790 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.422706 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.422683 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x9mhh\"/\"kube-root-ca.crt\"" Apr 21 11:10:20.423702 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.423684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x9mhh\"/\"default-dockercfg-bdbtk\"" Apr 21 11:10:20.423788 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.423702 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x9mhh\"/\"openshift-service-ca.crt\"" Apr 21 11:10:20.426867 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.426846 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/must-gather-9kj2m"] Apr 21 11:10:20.538599 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.538557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjlz\" (UniqueName: \"kubernetes.io/projected/0237ef6b-f3ee-43a5-af56-dd9541eb1a67-kube-api-access-tfjlz\") pod \"must-gather-9kj2m\" (UID: \"0237ef6b-f3ee-43a5-af56-dd9541eb1a67\") " pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.538770 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.538669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0237ef6b-f3ee-43a5-af56-dd9541eb1a67-must-gather-output\") pod \"must-gather-9kj2m\" (UID: \"0237ef6b-f3ee-43a5-af56-dd9541eb1a67\") " pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.639644 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.639614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0237ef6b-f3ee-43a5-af56-dd9541eb1a67-must-gather-output\") pod \"must-gather-9kj2m\" (UID: \"0237ef6b-f3ee-43a5-af56-dd9541eb1a67\") " pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.639774 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.639668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjlz\" (UniqueName: \"kubernetes.io/projected/0237ef6b-f3ee-43a5-af56-dd9541eb1a67-kube-api-access-tfjlz\") pod \"must-gather-9kj2m\" (UID: \"0237ef6b-f3ee-43a5-af56-dd9541eb1a67\") " pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.639988 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.639953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0237ef6b-f3ee-43a5-af56-dd9541eb1a67-must-gather-output\") pod \"must-gather-9kj2m\" (UID: \"0237ef6b-f3ee-43a5-af56-dd9541eb1a67\") " pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.648037 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.648008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjlz\" (UniqueName: \"kubernetes.io/projected/0237ef6b-f3ee-43a5-af56-dd9541eb1a67-kube-api-access-tfjlz\") pod \"must-gather-9kj2m\" (UID: \"0237ef6b-f3ee-43a5-af56-dd9541eb1a67\") " pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.737821 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.737738 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x9mhh/must-gather-9kj2m" Apr 21 11:10:20.856018 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.855993 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/must-gather-9kj2m"] Apr 21 11:10:20.858677 ip-10-0-140-234 kubenswrapper[2573]: W0421 11:10:20.858646 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0237ef6b_f3ee_43a5_af56_dd9541eb1a67.slice/crio-18b1a2c3cb620652bbf2899d695cc0b712bd4dac4aef866b2ca3db2c37f9138a WatchSource:0}: Error finding container 18b1a2c3cb620652bbf2899d695cc0b712bd4dac4aef866b2ca3db2c37f9138a: Status 404 returned error can't find the container with id 18b1a2c3cb620652bbf2899d695cc0b712bd4dac4aef866b2ca3db2c37f9138a Apr 21 11:10:20.860343 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:20.860326 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 11:10:21.072611 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:21.072575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/must-gather-9kj2m" event={"ID":"0237ef6b-f3ee-43a5-af56-dd9541eb1a67","Type":"ContainerStarted","Data":"18b1a2c3cb620652bbf2899d695cc0b712bd4dac4aef866b2ca3db2c37f9138a"} Apr 21 11:10:22.078298 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:22.078258 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/must-gather-9kj2m" event={"ID":"0237ef6b-f3ee-43a5-af56-dd9541eb1a67","Type":"ContainerStarted","Data":"2aac6473b476fe9cd19a015626d126fdfa3c523a6969bec00362001d6d30b1b9"} Apr 21 11:10:22.078298 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:22.078303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/must-gather-9kj2m" 
event={"ID":"0237ef6b-f3ee-43a5-af56-dd9541eb1a67","Type":"ContainerStarted","Data":"5892d98febaeeffc5c3ade045ba521d645d4bfeb3417ddcdc580cbf6264c18a5"} Apr 21 11:10:22.095391 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:22.095344 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x9mhh/must-gather-9kj2m" podStartSLOduration=1.290785573 podStartE2EDuration="2.095329182s" podCreationTimestamp="2026-04-21 11:10:20 +0000 UTC" firstStartedPulling="2026-04-21 11:10:20.860449711 +0000 UTC m=+3999.231250397" lastFinishedPulling="2026-04-21 11:10:21.664993316 +0000 UTC m=+4000.035794006" observedRunningTime="2026-04-21 11:10:22.093753332 +0000 UTC m=+4000.464554043" watchObservedRunningTime="2026-04-21 11:10:22.095329182 +0000 UTC m=+4000.466129891" Apr 21 11:10:23.203118 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:23.203081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zjd9r_b16969ed-ca80-4fde-b8cb-9e1cbd9d131d/global-pull-secret-syncer/0.log" Apr 21 11:10:23.269247 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:23.269193 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-27nvt_736b14db-747c-43b4-bbb6-55ccc6a8a3d8/konnectivity-agent/0.log" Apr 21 11:10:23.419153 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:23.419110 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-234.ec2.internal_0cbe835788d88cf79d40a6c28376b21d/haproxy/0.log" Apr 21 11:10:26.895522 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:26.895473 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/alertmanager/0.log" Apr 21 11:10:26.924277 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:26.924230 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/config-reloader/0.log" Apr 21 11:10:26.952011 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:26.951975 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/kube-rbac-proxy-web/0.log" Apr 21 11:10:26.975117 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:26.975084 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/kube-rbac-proxy/0.log" Apr 21 11:10:26.998731 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:26.998696 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/kube-rbac-proxy-metric/0.log" Apr 21 11:10:27.021784 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.021757 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/prom-label-proxy/0.log" Apr 21 11:10:27.046425 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.046390 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1ad7e63d-69fd-4355-afcb-eb1adfe55dc4/init-config-reloader/0.log" Apr 21 11:10:27.116361 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.116328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v6kj4_fb46e263-367f-4628-8a44-6e443f2c276d/kube-state-metrics/0.log" Apr 21 11:10:27.140829 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.140743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v6kj4_fb46e263-367f-4628-8a44-6e443f2c276d/kube-rbac-proxy-main/0.log" Apr 21 11:10:27.165968 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.165936 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v6kj4_fb46e263-367f-4628-8a44-6e443f2c276d/kube-rbac-proxy-self/0.log" Apr 21 11:10:27.196339 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.196307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-fb55fcdd5-67vqq_049e9c77-bac7-43db-b890-f3604cc9398b/metrics-server/0.log" Apr 21 11:10:27.325338 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.325307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wd84q_d5192e80-1d1c-45a0-9dea-817499443dd0/node-exporter/0.log" Apr 21 11:10:27.346042 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.346002 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wd84q_d5192e80-1d1c-45a0-9dea-817499443dd0/kube-rbac-proxy/0.log" Apr 21 11:10:27.369859 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.369827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wd84q_d5192e80-1d1c-45a0-9dea-817499443dd0/init-textfile/0.log" Apr 21 11:10:27.579505 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.579452 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/prometheus/0.log" Apr 21 11:10:27.601353 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.601299 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/config-reloader/0.log" Apr 21 11:10:27.624855 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.624825 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/thanos-sidecar/0.log" Apr 21 11:10:27.646948 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.646917 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/kube-rbac-proxy-web/0.log" Apr 21 11:10:27.672587 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.672559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/kube-rbac-proxy/0.log" Apr 21 11:10:27.695145 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.695113 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/kube-rbac-proxy-thanos/0.log" Apr 21 11:10:27.716518 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.716478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_92ac8451-a21e-41d5-bd03-d1daebae4cdf/init-config-reloader/0.log" Apr 21 11:10:27.790262 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.790226 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-lhrt8_821ef817-ee5c-486f-9814-da4d97b80753/prometheus-operator-admission-webhook/0.log" Apr 21 11:10:27.817166 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.817135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b4b858b84-8kgml_294e1d79-5934-4d4d-b1c2-65706662f756/telemeter-client/0.log" Apr 21 11:10:27.839094 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.839012 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b4b858b84-8kgml_294e1d79-5934-4d4d-b1c2-65706662f756/reload/0.log" Apr 21 11:10:27.863676 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:27.863644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b4b858b84-8kgml_294e1d79-5934-4d4d-b1c2-65706662f756/kube-rbac-proxy/0.log" Apr 21 11:10:29.129614 
ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.129583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-9gkbz_1e5525d0-e4ac-4ad3-8e4a-af53759f4ae0/networking-console-plugin/0.log" Apr 21 11:10:29.815672 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.815642 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq"] Apr 21 11:10:29.820347 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.820323 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:29.825933 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.825907 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq"] Apr 21 11:10:29.931677 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.931639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c54dx\" (UniqueName: \"kubernetes.io/projected/eec708df-7c50-4f2b-8bcb-d13e1c35a969-kube-api-access-c54dx\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:29.931833 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.931690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-lib-modules\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:29.931833 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.931802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-sys\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:29.931905 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.931861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-podres\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:29.931943 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:29.931910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-proc\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033117 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-lib-modules\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033285 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-sys\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033285 ip-10-0-140-234 
kubenswrapper[2573]: I0421 11:10:30.033159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-podres\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033285 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-proc\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033285 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-proc\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033285 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-sys\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033488 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c54dx\" (UniqueName: \"kubernetes.io/projected/eec708df-7c50-4f2b-8bcb-d13e1c35a969-kube-api-access-c54dx\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " 
pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033488 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-podres\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.033488 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.033269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eec708df-7c50-4f2b-8bcb-d13e1c35a969-lib-modules\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.041228 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.041207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c54dx\" (UniqueName: \"kubernetes.io/projected/eec708df-7c50-4f2b-8bcb-d13e1c35a969-kube-api-access-c54dx\") pod \"perf-node-gather-daemonset-wcpvq\" (UID: \"eec708df-7c50-4f2b-8bcb-d13e1c35a969\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.133765 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.133686 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" Apr 21 11:10:30.269092 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:30.269037 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq"] Apr 21 11:10:30.272987 ip-10-0-140-234 kubenswrapper[2573]: W0421 11:10:30.272935 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeec708df_7c50_4f2b_8bcb_d13e1c35a969.slice/crio-118e439d7ae183126e3729e16633ac86335f6eeb68df60c76e92ebf414ab2bd3 WatchSource:0}: Error finding container 118e439d7ae183126e3729e16633ac86335f6eeb68df60c76e92ebf414ab2bd3: Status 404 returned error can't find the container with id 118e439d7ae183126e3729e16633ac86335f6eeb68df60c76e92ebf414ab2bd3 Apr 21 11:10:31.047857 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.047827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bqc7r_2a4340c5-5a53-4cd3-b487-d469b4bb82c5/dns/0.log" Apr 21 11:10:31.071157 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.071131 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bqc7r_2a4340c5-5a53-4cd3-b487-d469b4bb82c5/kube-rbac-proxy/0.log" Apr 21 11:10:31.111646 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.111607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" event={"ID":"eec708df-7c50-4f2b-8bcb-d13e1c35a969","Type":"ContainerStarted","Data":"e60c95ab5228886508f9ac50f7e0164b995d61f7ec4b77e02b6b13a3ed726eca"} Apr 21 11:10:31.111646 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.111649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" event={"ID":"eec708df-7c50-4f2b-8bcb-d13e1c35a969","Type":"ContainerStarted","Data":"118e439d7ae183126e3729e16633ac86335f6eeb68df60c76e92ebf414ab2bd3"} Apr 
21 11:10:31.111828 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.111759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq"
Apr 21 11:10:31.152173 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.152112 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-594w4_0622ef89-a9c2-4672-891f-4e52ebb096b4/dns-node-resolver/0.log"
Apr 21 11:10:31.661048 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:31.661021 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ffsc5_b9feaa36-784e-406f-b11b-9f103755a6a0/node-ca/0.log"
Apr 21 11:10:32.687752 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:32.687708 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cd55l_dcf4e3a0-23bd-4b3c-a708-c10c960ca6c1/serve-healthcheck-canary/0.log"
Apr 21 11:10:33.115866 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:33.115838 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8npvl_1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e/kube-rbac-proxy/0.log"
Apr 21 11:10:33.136816 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:33.136792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8npvl_1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e/exporter/0.log"
Apr 21 11:10:33.158883 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:33.158851 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8npvl_1c1cffa6-d9b7-4ec6-8295-0fe0de45a40e/extractor/0.log"
Apr 21 11:10:35.322471 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:35.322432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-mdh49_79736023-3e01-47ad-bef8-701841fb35ab/manager/0.log"
Apr 21 11:10:35.594284 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:35.594214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-k2gbb_88ae03a6-c387-4295-8629-d80af06c998b/s3-init/0.log"
Apr 21 11:10:35.625511 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:35.625476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-w88v9_b528c15b-ffd2-48b5-9204-df01b43e33c4/s3-tls-init-custom/0.log"
Apr 21 11:10:35.702749 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:35.702712 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-clx99_0c5fcab5-5912-4c82-832f-5da7d4b6460b/seaweedfs-tls-custom/0.log"
Apr 21 11:10:37.124901 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:37.124871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq"
Apr 21 11:10:37.143167 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:37.143110 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-wcpvq" podStartSLOduration=8.143096086 podStartE2EDuration="8.143096086s" podCreationTimestamp="2026-04-21 11:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:10:31.130293083 +0000 UTC m=+4009.501093793" watchObservedRunningTime="2026-04-21 11:10:37.143096086 +0000 UTC m=+4015.513896795"
Apr 21 11:10:40.004695 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:40.004611 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jwp2j_087fe162-4bd2-4285-92a8-117f3a58caa3/kube-storage-version-migrator-operator/1.log"
Apr 21 11:10:40.006767 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:40.006740 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jwp2j_087fe162-4bd2-4285-92a8-117f3a58caa3/kube-storage-version-migrator-operator/0.log"
Apr 21 11:10:40.970357 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:40.970279 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-27bm5_5d0fa637-dd7c-4b7c-b273-afeb822c11b6/kube-multus/0.log"
Apr 21 11:10:41.183765 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.183733 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/kube-multus-additional-cni-plugins/0.log"
Apr 21 11:10:41.203740 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.203716 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/egress-router-binary-copy/0.log"
Apr 21 11:10:41.229716 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.229644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/cni-plugins/0.log"
Apr 21 11:10:41.251589 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.251559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/bond-cni-plugin/0.log"
Apr 21 11:10:41.273402 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.273373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/routeoverride-cni/0.log"
Apr 21 11:10:41.298677 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.298646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/whereabouts-cni-bincopy/0.log"
Apr 21 11:10:41.320624 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.320598 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzkgn_5ee61994-bf42-4dfb-8334-fb990a0f5d8f/whereabouts-cni/0.log"
Apr 21 11:10:41.608677 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.608646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g9jwm_a931daa8-594d-442d-b462-5f77532314a5/network-metrics-daemon/0.log"
Apr 21 11:10:41.631210 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:41.631180 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g9jwm_a931daa8-594d-442d-b462-5f77532314a5/kube-rbac-proxy/0.log"
Apr 21 11:10:42.870219 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:42.870186 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-controller/0.log"
Apr 21 11:10:42.891355 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:42.891319 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/0.log"
Apr 21 11:10:42.926993 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:42.926962 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovn-acl-logging/1.log"
Apr 21 11:10:42.991649 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:42.991614 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/kube-rbac-proxy-node/0.log"
Apr 21 11:10:43.054174 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:43.054148 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 11:10:43.074232 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:43.074197 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/northd/0.log"
Apr 21 11:10:43.097385 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:43.097356 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/nbdb/0.log"
Apr 21 11:10:43.120747 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:43.120681 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/sbdb/0.log"
Apr 21 11:10:43.322622 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:43.322586 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nz95q_c6a94c5c-b7cd-4e43-9d1e-59ac152bc150/ovnkube-controller/0.log"
Apr 21 11:10:44.621950 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:44.621913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9n5rr_4f2983b7-be09-42ac-b5a7-0c43883354da/network-check-target-container/0.log"
Apr 21 11:10:45.576160 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:45.576101 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wg6rg_d93f69ab-ab85-442f-bac7-c3bcf5b11b8e/iptables-alerter/0.log"
Apr 21 11:10:46.217419 ip-10-0-140-234 kubenswrapper[2573]: I0421 11:10:46.217386 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9j9bx_0e4294b2-e56e-4ceb-bbc1-4ab1d38dc27b/tuned/0.log"